From 635b038e9e31e053a3509a389b2b7bfe6086cda1 Mon Sep 17 00:00:00 2001
From: 0xlukem
Date: Fri, 24 Oct 2025 16:33:30 -0300
Subject: [PATCH 1/3] remove hr from manual cards

---
 develop/interoperability/index.md | 4 ----
 develop/interoperability/xcm-guides/index.md | 4 ----
 develop/parachains/customize-parachain/index.md | 3 ---
 develop/parachains/deployment/index.md | 1 -
 develop/parachains/maintenance/index.md | 2 --
 develop/parachains/testing/index.md | 2 --
 develop/toolkit/api-libraries/index.md | 2 --
 develop/toolkit/parachains/fork-chains/chopsticks/index.md | 2 --
 develop/toolkit/parachains/fork-chains/index.md | 1 -
 develop/toolkit/parachains/spawn-chains/index.md | 1 -
 develop/toolkit/parachains/spawn-chains/zombienet/index.md | 1 -
 infrastructure/running-a-validator/index.md | 3 ---
 .../running-a-validator/onboarding-and-offboarding/index.md | 4 ----
 .../running-a-validator/operational-tasks/index.md | 2 --
 infrastructure/staking-mechanics/index.md | 3 ---
 tutorials/dapps/index.md | 2 --
 tutorials/index.md | 5 -----
 tutorials/interoperability/index.md | 2 --
 tutorials/interoperability/xcm-channels/index.md | 1 -
 tutorials/onchain-governance/index.md | 1 -
 tutorials/polkadot-sdk/index.md | 1 -
 tutorials/polkadot-sdk/system-chains/asset-hub/index.md | 1 -
 22 files changed, 48 deletions(-)

diff --git a/develop/interoperability/index.md b/develop/interoperability/index.md
index bda85fcce..da17828bd 100644
--- a/develop/interoperability/index.md
+++ b/develop/interoperability/index.md
@@ -21,28 +21,24 @@ This section covers everything you need to know about building and implementing

Review the Polkadot SDK's XCM Documentation

-

Dive into the official documentation to learn about the key components for supporting XCM in your parachain and enabling seamless cross-chain communication.

Follow Step-by-Step Tutorials

-

Enhance your XCM skills with step-by-step tutorials on building interoperability solutions on Polkadot SDK-based blockchains.

Familiarize Yourself with the XCM Format

-

Gain a deeper understanding of the XCM format and structure, including any extra data it may need and what each part of a message means.

Essential XCM Tools

-

Explore essential tools for creating and integrating cross-chain solutions within the Polkadot ecosystem.

diff --git a/develop/interoperability/xcm-guides/index.md b/develop/interoperability/xcm-guides/index.md
index dc0c16157..38cf9f3ef 100644
--- a/develop/interoperability/xcm-guides/index.md
+++ b/develop/interoperability/xcm-guides/index.md
@@ -20,28 +20,24 @@ Whether you're building applications that need to interact with multiple chains,

Send XCM Messages

-

Learn the fundamentals of sending cross-chain messages using XCM, including message structure, routing, and execution patterns.

XCM Configuration

-

Learn how to configure XCM for your chain.

Test and Debug

-

Learn how to test and debug cross-chain communication via the XCM Emulator to ensure interoperability and reliable execution.

XCM Channels

-

Learn how to configure XCM channels for your chain.

diff --git a/develop/parachains/customize-parachain/index.md b/develop/parachains/customize-parachain/index.md
index 1c4c41eaf..41d903493 100644
--- a/develop/parachains/customize-parachain/index.md
+++ b/develop/parachains/customize-parachain/index.md
@@ -20,21 +20,18 @@ The [FRAME directory](https://github.com/paritytech/polkadot-sdk/tree/{{dependen

FRAME Repository

-

View the source code of the FRAME development environment that provides pallets you can use, modify, and extend to build the runtime logic to suit the needs of your blockchain.

FRAME Rust docs

-

Check out the Rust documentation for FRAME, Substrate's preferred framework for building runtimes.

Polkadot SDK Best Practices

-

Understand and address common issues that can arise in blockchain development when building with the Polkadot SDK.

diff --git a/develop/parachains/deployment/index.md b/develop/parachains/deployment/index.md
index ec45f2e10..a6d6d500a 100644
--- a/develop/parachains/deployment/index.md
+++ b/develop/parachains/deployment/index.md
@@ -71,7 +71,6 @@ flowchart TD

Check Out the Chain Spec Builder Docs

-

Learn about Substrate’s chain spec builder utility.

diff --git a/develop/parachains/maintenance/index.md b/develop/parachains/maintenance/index.md
index 0a135999a..4eaaf37f3 100644
--- a/develop/parachains/maintenance/index.md
+++ b/develop/parachains/maintenance/index.md
@@ -18,14 +18,12 @@ Learn how to maintain Polkadot SDK-based networks, focusing on runtime monitorin

Single Block Migration Example

-

Check out an example pallet demonstrating best practices for writing single-block migrations while upgrading pallet storage.

Client Telemetry Crate

-

Check out the docs on Substrate's client telemetry, the part of Substrate that allows telemetry data to be ingested by services such as Polkadot Telemetry.

diff --git a/develop/parachains/testing/index.md b/develop/parachains/testing/index.md
index e542b335f..c35d9b44d 100644
--- a/develop/parachains/testing/index.md
+++ b/develop/parachains/testing/index.md
@@ -25,14 +25,12 @@ Through these guides, you'll learn to:

`sp_runtime` crate Rust docs

-

Learn about Substrate Runtime primitives that enable communication between a Substrate blockchain's runtime and client.

Moonwall Testing Framework

-

Moonwall is a comprehensive blockchain test framework for Substrate-based networks.

diff --git a/develop/toolkit/api-libraries/index.md b/develop/toolkit/api-libraries/index.md
index 2a311a8e2..df16feeb7 100644
--- a/develop/toolkit/api-libraries/index.md
+++ b/develop/toolkit/api-libraries/index.md
@@ -18,14 +18,12 @@ Explore the powerful API libraries designed for interacting with the Polkadot ne

Understand Chain Data

-

Familiarize yourself with the data provided by the APIs, including available calls, events, types, and storage items.

Network Configurations

-

Obtain the necessary configurations and WSS endpoints to interact with the APIs on Polkadot networks.

diff --git a/develop/toolkit/parachains/fork-chains/chopsticks/index.md b/develop/toolkit/parachains/fork-chains/chopsticks/index.md
index b98174a5c..8e4c1db90 100644
--- a/develop/toolkit/parachains/fork-chains/chopsticks/index.md
+++ b/develop/toolkit/parachains/fork-chains/chopsticks/index.md
@@ -28,14 +28,12 @@ Whether you're debugging an issue, testing new features, or exploring cross-chai

Chopsticks Repository

-

View the official Chopsticks GitHub repository. Browse the code, review sample commands, and track issues and new releases.

Fork Live Chains with Chopsticks

-

Learn how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM, and interact programmatically or via UI.

diff --git a/develop/toolkit/parachains/fork-chains/index.md b/develop/toolkit/parachains/fork-chains/index.md
index a25d6a5f0..d2e842e2d 100644
--- a/develop/toolkit/parachains/fork-chains/index.md
+++ b/develop/toolkit/parachains/fork-chains/index.md
@@ -29,7 +29,6 @@ Forking a live chain creates a controlled environment that mirrors live network

Step-by-Step Tutorial on Forking Live Chains with Chopsticks

-

This tutorial walks you through how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM execution.

diff --git a/develop/toolkit/parachains/spawn-chains/index.md b/develop/toolkit/parachains/spawn-chains/index.md
index 12c7bb1fc..5f4ebf7e5 100644
--- a/develop/toolkit/parachains/spawn-chains/index.md
+++ b/develop/toolkit/parachains/spawn-chains/index.md
@@ -29,7 +29,6 @@ Spawning a network provides a controlled environment to test and validate variou

Spawn a Chain with Zombienet

-

Learn to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

\ No newline at end of file
diff --git a/develop/toolkit/parachains/spawn-chains/zombienet/index.md b/develop/toolkit/parachains/spawn-chains/zombienet/index.md
index 64ea769c2..b134cc1de 100644
--- a/develop/toolkit/parachains/spawn-chains/zombienet/index.md
+++ b/develop/toolkit/parachains/spawn-chains/zombienet/index.md
@@ -27,7 +27,6 @@ Whether you're building a new parachain or testing runtime upgrades, Zombienet p

Spawn a Chain with Zombienet Tutorial

-

Follow step-by-step instructions to spawn, connect to, and monitor a basic blockchain network with Zombienet, using customizable configurations for streamlined development and debugging.

diff --git a/infrastructure/running-a-validator/index.md b/infrastructure/running-a-validator/index.md
index b6964df1b..a058f70b8 100644
--- a/infrastructure/running-a-validator/index.md
+++ b/infrastructure/running-a-validator/index.md
@@ -20,21 +20,18 @@ Learn the requirements for setting up a Polkadot validator node, along with deta

Explore Rewards, Offenses, and Slashes

-

Learn about Polkadot's offenses and slashing system, along with validator rewards, era points, and nominator payments.

Check Out the Decentralized Nodes Program

-

The Decentralized Nodes program aims to support Polkadot's security and decentralization by involving a diverse set of validators. Learn more and apply.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/infrastructure/running-a-validator/onboarding-and-offboarding/index.md b/infrastructure/running-a-validator/onboarding-and-offboarding/index.md
index 5cb8a14d7..46a1de413 100644
--- a/infrastructure/running-a-validator/onboarding-and-offboarding/index.md
+++ b/infrastructure/running-a-validator/onboarding-and-offboarding/index.md
@@ -20,28 +20,24 @@ This section provides guidance on how to set up, activate, and deactivate your v

Review the Requirements

-

Explore the technical and system requirements for running a Polkadot validator, including setup, hardware, staking prerequisites, and security best practices.

Learn About Staking Mechanics

-

Explore the staking mechanics in Polkadot, focusing on how they relate to validators, including offenses and slashes, as well as reward payouts.

Maintain Your Node

-

Learn how to manage your Polkadot validator node, including monitoring performance, running a backup validator for maintenance, and rotating keys.

Get Help and Connect With Experts

-

For help, connect with the Polkadot Validator Lounge on Element, where both the team and experienced validators are ready to assist.

diff --git a/infrastructure/running-a-validator/operational-tasks/index.md b/infrastructure/running-a-validator/operational-tasks/index.md
index 38217d1fc..978f26417 100644
--- a/infrastructure/running-a-validator/operational-tasks/index.md
+++ b/infrastructure/running-a-validator/operational-tasks/index.md
@@ -18,14 +18,12 @@ Running a Polkadot validator node involves several key operational tasks to ensu

Access Real-Time Validator Metrics

-

Check the Polkadot Telemetry dashboard for real-time insights into node performance, including validator status, connectivity, block production, and software version to identify potential issues.

Stay Up to Date with Runtime Upgrades

-

Learn how to monitor the Polkadot network for upcoming upgrades, so you can prepare your validator node for any required updates or modifications.

diff --git a/infrastructure/staking-mechanics/index.md b/infrastructure/staking-mechanics/index.md
index ea3dc258f..305ad7719 100644
--- a/infrastructure/staking-mechanics/index.md
+++ b/infrastructure/staking-mechanics/index.md
@@ -18,21 +18,18 @@ Gain a deep understanding of the staking mechanics in Polkadot, with a focus on

Learn About Nominated Proof of Stake

-

Take a deeper dive into the fundamentals of Polkadot's Nominated Proof of Stake (NPoS) consensus mechanism.

Dive Deep into Slashing Mechanisms

-

Read the Web3 Foundation's research article on slashing mechanisms for a comprehensive understanding of slashing, along with an in-depth examination of the offenses involved.

Review Validator Rewards Metrics

-

Check out Dune's Polkadot Staking Rewards dashboard for a detailed look at validator-specific metrics over time, such as daily staking rewards, nominator count, reward points, and more.

diff --git a/tutorials/dapps/index.md b/tutorials/dapps/index.md
index a67b8a74e..0e425bdc5 100644
--- a/tutorials/dapps/index.md
+++ b/tutorials/dapps/index.md
@@ -20,14 +20,12 @@ You'll explore a range of topics—from client-side apps and CLI tools to on-cha

Polkadot API (PAPI)

-

Learn how to use the Polkadot API to build dApps that interact with Polkadot SDK-based chains directly via RPC or light clients.

Start Building on Polkadot

-

Get an overview of the tools, SDKs, and templates available for building with Polkadot—from runtime development to frontend integration.

diff --git a/tutorials/index.md b/tutorials/index.md
index 86de48570..96e58d0ea 100644
--- a/tutorials/index.md
+++ b/tutorials/index.md
@@ -20,7 +20,6 @@ The Zero to Hero series offers step-by-step guidance to development across the P

Parachain Zero to Hero

-

Begin with a template, then follow this series of step-by-step guides to add pallets, write unit tests and benchmarks, run your parachain locally, perform runtime upgrades, deploy to TestNet, and obtain coretime.

@@ -32,28 +31,24 @@ The Zero to Hero series offers step-by-step guidance to development across the P

Set Up a Template

-

Learn to compile and run a local parachain node using the Polkadot SDK. Launch, run, and interact with a pre-configured runtime template.

Build a Custom Pallet

-

Learn how to build a custom pallet for Polkadot SDK-based blockchains with this step-by-step guide. Create and configure a simple counter pallet from scratch.

Fork a Live Chain with Chopsticks

-

Learn how to fork live Polkadot SDK chains with Chopsticks. Configure forks, replay blocks, test XCM, and interact programmatically or via UI.

Open an XCM Channel

-

Learn how to open HRMP channels between parachains on Polkadot. Discover the step-by-step process for establishing uni- and bidirectional communication.

diff --git a/tutorials/interoperability/index.md b/tutorials/interoperability/index.md
index 83c3e4550..4d27d61e1 100644
--- a/tutorials/interoperability/index.md
+++ b/tutorials/interoperability/index.md
@@ -31,14 +31,12 @@ Learn to establish and use cross-chain communication channels:

Learn about Polkadot's Interoperability

-

Explore the importance of interoperability in the Polkadot ecosystem, covering XCM, bridges, and cross-chain communication.

Explore Comprehensive XCM Guides

-

Looking for comprehensive guides and technical resources on XCM? Explore foundational concepts, advanced configuration, and best practices for building cross-chain solutions using XCM.

diff --git a/tutorials/interoperability/xcm-channels/index.md b/tutorials/interoperability/xcm-channels/index.md
index a927df0b6..47253b6e9 100644
--- a/tutorials/interoperability/xcm-channels/index.md
+++ b/tutorials/interoperability/xcm-channels/index.md
@@ -26,7 +26,6 @@ To enable communication between parachains, explicit HRMP channels must be estab

Review HRMP Configurations and Extrinsics

-

Learn about the configurable parameters that govern HRMP channel behavior and the dispatchable extrinsics used to manage them.

diff --git a/tutorials/onchain-governance/index.md b/tutorials/onchain-governance/index.md
index fb270890d..2d68e137f 100644
--- a/tutorials/onchain-governance/index.md
+++ b/tutorials/onchain-governance/index.md
@@ -20,7 +20,6 @@ This section provides step-by-step tutorials to help you navigate the technical

Learn More About Polkadot's OpenGov

-

Explore Polkadot's decentralized on-chain governance system, OpenGov, including how it works, the proposal process, and key info for developers.

diff --git a/tutorials/polkadot-sdk/index.md b/tutorials/polkadot-sdk/index.md
index 633105c3c..d9f4ee471 100644
--- a/tutorials/polkadot-sdk/index.md
+++ b/tutorials/polkadot-sdk/index.md
@@ -28,7 +28,6 @@ Follow these key milestones to guide you through parachain development. Each ste

View the Polkadot SDK Source Code

-

Check out the Polkadot SDK repository on GitHub to explore the source code and stay updated on the latest releases.

diff --git a/tutorials/polkadot-sdk/system-chains/asset-hub/index.md b/tutorials/polkadot-sdk/system-chains/asset-hub/index.md
index 692ed8e8a..fc6a91236 100644
--- a/tutorials/polkadot-sdk/system-chains/asset-hub/index.md
+++ b/tutorials/polkadot-sdk/system-chains/asset-hub/index.md
@@ -35,7 +35,6 @@ Through these tutorials, you'll learn how to manage cross-chain assets, includin

Learn More About Asset Hub

-

Explore the fundamentals of Asset Hub, including managing on-chain assets, foreign asset integration, and using XCM for cross-chain asset transfers.

From 37701a1ba8d61682e7d8976c59426c14e3e168d9 Mon Sep 17 00:00:00 2001 From: 0xlukem Date: Fri, 24 Oct 2025 18:12:37 -0300 Subject: [PATCH 2/3] llms --- .ai/categories/basics.md | 189 +-- .ai/categories/dapps.md | 195 +-- .ai/categories/infrastructure.md | 234 +--- .ai/categories/networks.md | 195 +-- .ai/categories/parachains.md | 358 +----- .ai/categories/polkadot-protocol.md | 195 +-- .ai/categories/reference.md | 6 +- .ai/categories/smart-contracts.md | 413 +------ .ai/categories/tooling.md | 1056 +---------------- .../develop-interoperability-send-messages.md | 3 +- ...develop-interoperability-test-and-debug.md | 56 +- .../develop-interoperability-xcm-guides.md | 4 - ...velop-interoperability-xcm-runtime-apis.md | 6 +- .ai/pages/develop-interoperability.md | 4 - .../develop-parachains-customize-parachain.md | 3 - ...-deployment-build-deterministic-runtime.md | 24 +- .ai/pages/develop-parachains-deployment.md | 1 - ...rachains-maintenance-storage-migrations.md | 106 +- .ai/pages/develop-parachains-maintenance.md | 2 - ...develop-parachains-testing-benchmarking.md | 35 +- .ai/pages/develop-parachains-testing.md | 2 - ...s-precompiles-interact-with-precompiles.md | 218 ---- .ai/pages/develop-toolkit-api-libraries.md | 2 - ...olkit-parachains-fork-chains-chopsticks.md | 2 - .../develop-toolkit-parachains-fork-chains.md | 1 - ...olkit-parachains-spawn-chains-zombienet.md | 1 - ...develop-toolkit-parachains-spawn-chains.md | 1 - ...arding-and-offboarding-start-validating.md | 39 +- ...-a-validator-onboarding-and-offboarding.md | 4 - ...e-running-a-validator-operational-tasks.md | 2 - .../infrastructure-running-a-validator.md | 3 - .ai/pages/infrastructure-staking-mechanics.md | 3 - ...adot-protocol-parachain-basics-accounts.md | 30 +- .ai/pages/tutorials-dapps.md | 2 - ...tutorials-interoperability-xcm-channels.md | 1 - .ai/pages/tutorials-interoperability.md | 2 - .ai/pages/tutorials-onchain-governance.md | 1 - ...ins-zero-to-hero-add-pallets-to-runtime.md | 56 +- ...ls-polkadot-sdk-system-chains-asset-hub.md | 1 - .ai/pages/tutorials-polkadot-sdk.md | 1 - ...nch-your-first-project-create-contracts.md | 44 +- ...our-first-project-create-dapp-ethers-js.md | 478 +------- ...nch-your-first-project-create-dapp-viem.md | 362 +----- ...st-project-test-and-deploy-with-hardhat.md | 45 +- .ai/pages/tutorials.md | 5 - .ai/site-index.json | 714 +++++------ llms-full.jsonl | 194 +-- 47 files changed, 609 insertions(+), 4690 deletions(-) diff --git a/.ai/categories/basics.md b/.ai/categories/basics.md index 09e619b6b..3abb8916f 100644 --- a/.ai/categories/basics.md +++ b/.ai/categories/basics.md @@ -140,7 +140,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... 
polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -287,63 +287,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1684,53 +1634,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -5018,16 +4928,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -5051,24 +4952,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. - pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -5779,8 +5663,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -6602,69 +6485,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! { - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. diff --git a/.ai/categories/dapps.md b/.ai/categories/dapps.md index ec32cef75..098dc2577 100644 --- a/.ai/categories/dapps.md +++ b/.ai/categories/dapps.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... 
polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -2044,53 +1994,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -7375,16 +7285,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7408,24 +7309,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. - pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -8809,8 +8693,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -10000,69 +9883,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! { - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -11340,7 +11173,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? 
interface "Input parameters" @@ -11617,7 +11450,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -11848,7 +11681,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/infrastructure.md b/.ai/categories/infrastructure.md index 41161c4ab..2588c71da 100644 --- a/.ai/categories/infrastructure.md +++ b/.ai/categories/infrastructure.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = 
"unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1685,53 +1635,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -7206,16 +7116,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7239,24 +7140,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. 
- pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -8174,8 +8058,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -9921,44 +9804,7 @@ touch /etc/systemd/system/polkadot-validator.service In this unit file, you will write the commands that you want to run on server boot/restart: ```systemd title="/etc/systemd/system/polkadot-validator.service" -[Unit] -Description=Polkadot Node -After=network.target -Documentation=https://github.com/paritytech/polkadot-sdk - -[Service] -EnvironmentFile=-/etc/default/polkadot -ExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS -User=polkadot -Group=polkadot -Restart=always -RestartSec=120 -CapabilityBoundingSet= -LockPersonality=true -NoNewPrivileges=true -PrivateDevices=true -PrivateMounts=true -PrivateTmp=true -PrivateUsers=true -ProtectClock=true -ProtectControlGroups=true -ProtectHostname=true -ProtectKernelModules=true -ProtectKernelTunables=true -ProtectSystem=strict -RemoveIPC=true -RestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX -RestrictNamespaces=false -RestrictSUIDSGID=true -SystemCallArchitectures=native -SystemCallFilter=@system-service -SystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2 -SystemCallFilter=~@clock @module @reboot @swap @privileged -SystemCallFilter=pivot_root -UMask=0027 - -[Install] -WantedBy=multi-user.target + ``` !!! warning "Restart delay and equivocation risk" @@ -10123,69 +9969,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! 
{ - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -11694,7 +11490,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -11971,7 +11767,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? 
interface "Input parameters" @@ -12202,7 +11998,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/networks.md b/.ai/categories/networks.md index 4b6c6f319..6bbf52aec 100644 --- a/.ai/categories/networks.md +++ b/.ai/categories/networks.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1685,53 +1635,13 @@ To build the smart contract, follow the steps below: 6. 
Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -6258,16 +6168,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -6291,24 +6192,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. 
- pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -7019,8 +6903,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -7842,69 +7725,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! { - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. 
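For orientation, here is a minimal sketch of how the chains declared by these macros are typically exercised in a test. It assumes the `Westend` relay chain and `AssetHubWestend` parachain types generated by the macros above are in scope and uses the emulator's `TestExt::execute_with` helper; the dispatches and assertions are placeholders rather than a definitive test.

```rust
// Hedged sketch only: assumes the `Westend` and `AssetHubWestend` types
// produced by the macros above are available in the test crate.
use xcm_emulator::TestExt;

#[test]
fn example_cross_chain_scenario() {
    // Run the first half of the scenario inside the relay chain's
    // test externalities (e.g. send an XCM towards the parachain).
    Westend::execute_with(|| {
        // ... dispatch pallet_xcm::send or a teleport here and assert on its events ...
    });

    // The emulator delivers the queued messages between chains, so the
    // effects can then be checked on the destination parachain.
    AssetHubWestend::execute_with(|| {
        // ... assert that the expected balance change or event occurred ...
    });
}
```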
For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -9089,7 +8922,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -9366,7 +9199,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -9597,7 +9430,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/parachains.md b/.ai/categories/parachains.md index a10d6b94a..1bdcf1ac0 100644 --- a/.ai/categories/parachains.md +++ b/.ai/categories/parachains.md @@ -538,7 +538,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... 
polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -685,63 +685,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1048,40 +998,7 @@ my-pallet/ With the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\_blank} to get started as follows: ```rust title="benchmarking.rs (starter template)" -//! 
Benchmarking setup for pallet-template -#![cfg(feature = "runtime-benchmarks")] -use super::*; -use frame_benchmarking::v2::*; - -#[benchmarks] -mod benchmarks { - use super::*; - #[cfg(test)] - use crate::pallet::Pallet as Template; - use frame_system::RawOrigin; - - #[benchmark] - fn do_something() { - let caller: T::AccountId = whitelisted_caller(); - #[extrinsic_call] - do_something(RawOrigin::Signed(caller), 100); - - assert_eq!(Something::::get().map(|v| v.block_number), Some(100u32.into())); - } - - #[benchmark] - fn cause_error() { - Something::::put(CompositeStruct { block_number: 100u32.into() }); - let caller: T::AccountId = whitelisted_caller(); - #[extrinsic_call] - cause_error(RawOrigin::Signed(caller)); - - assert_eq!(Something::::get().map(|v| v.block_number), Some(101u32.into())); - } - - impl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test); -} ``` In your benchmarking tests, employ these best practices: @@ -2114,29 +2031,7 @@ To add a GitHub workflow for building the runtime: {% raw %} ```yml - name: Srtool build - - on: push - - jobs: - srtool: - runs-on: ubuntu-latest - strategy: - matrix: - chain: ["asset-hub-kusama", "asset-hub-westend"] - steps: - - uses: actions/checkout@v3 - - name: Srtool build - id: srtool_build - uses: chevdor/srtool-actions@v0.8.0 - with: - chain: ${{ matrix.chain }} - runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime - - name: Summary - run: | - echo '${{ steps.srtool_build.outputs.json }}' | jq . > ${{ matrix.chain }}-srtool-digest.json - cat ${{ matrix.chain }}-srtool-digest.json - echo "Runtime location: ${{ steps.srtool_build.outputs.wasm }}" + ``` {% endraw %} @@ -2897,53 +2792,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -12460,16 +12315,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. 
In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -12493,24 +12339,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. - pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -13500,8 +13329,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -14416,111 +14244,7 @@ Examine the following migration example that transforms a simple `StorageValue` - Migration: ```rust - use frame_support::{ - storage_alias, - traits::{Get, UncheckedOnRuntimeUpgrade}, - }; - - #[cfg(feature = "try-runtime")] - use alloc::vec::Vec; - - /// Collection of storage item formats from the previous storage version. - /// - /// Required so we can read values in the v0 storage format during the migration. - mod v0 { - use super::*; - - /// V0 type for [`crate::Value`]. - #[storage_alias] - pub type Value = StorageValue, u32>; - } - - /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1. - /// - /// In V0 of the template [`crate::Value`] is just a `u32`. In V1, it has been upgraded to - /// contain the struct [`crate::CurrentAndPreviousValue`]. - /// - /// In this migration, update the on-chain storage for the pallet to reflect the new storage - /// layout. 
- pub struct InnerMigrateV0ToV1(core::marker::PhantomData); - - impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 { - /// Return the existing [`crate::Value`] so we can check that it was correctly set in - /// `InnerMigrateV0ToV1::post_upgrade`. - #[cfg(feature = "try-runtime")] - fn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> { - use codec::Encode; - - // Access the old value using the `storage_alias` type - let old_value = v0::Value::::get(); - // Return it as an encoded `Vec` - Ok(old_value.encode()) - } - - /// Migrate the storage from V0 to V1. - /// - /// - If the value doesn't exist, there is nothing to do. - /// - If the value exists, it is read and then written back to storage inside a - /// [`crate::CurrentAndPreviousValue`]. - fn on_runtime_upgrade() -> frame_support::weights::Weight { - // Read the old value from storage - if let Some(old_value) = v0::Value::::take() { - // Write the new value to storage - let new = crate::CurrentAndPreviousValue { current: old_value, previous: None }; - crate::Value::::put(new); - // One read + write for taking the old value, and one write for setting the new value - T::DbWeight::get().reads_writes(1, 2) - } else { - // No writes since there was no old value, just one read for checking - T::DbWeight::get().reads(1) - } - } - - /// Verifies the storage was migrated correctly. - /// - /// - If there was no old value, the new value should not be set. - /// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`]. - #[cfg(feature = "try-runtime")] - fn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> { - use codec::Decode; - use frame_support::ensure; - - let maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| { - sp_runtime::TryRuntimeError::Other("Failed to decode old value from storage") - })?; - - match maybe_old_value { - Some(old_value) => { - let expected_new_value = - crate::CurrentAndPreviousValue { current: old_value, previous: None }; - let actual_new_value = crate::Value::::get(); - - ensure!(actual_new_value.is_some(), "New value not set"); - ensure!( - actual_new_value == Some(expected_new_value), - "New value not set correctly" - ); - }, - None => { - ensure!(crate::Value::::get().is_none(), "New value unexpectedly set"); - }, - }; - Ok(()) - } - } - - /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a - /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that: - /// - The migration only runs once when the on-chain storage version is 0 - /// - The on-chain storage version is updated to `1` after the migration executes - /// - Reads/Writes from checking/settings the on-chain storage version are accounted for - pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration< - 0, // The migration will only execute when the on-chain storage version is 0 - 1, // The on-chain storage version will be set to 1 after the migration is complete - InnerMigrateV0ToV1, - crate::pallet::Pallet, - ::DbWeight, - >; + ``` ### Migration Organization @@ -14677,69 +14401,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! 
{ - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -16417,7 +16091,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -16694,7 +16368,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? 
interface "Input parameters" @@ -16925,7 +16599,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/polkadot-protocol.md b/.ai/categories/polkadot-protocol.md index c978d05db..d88e67c5c 100644 --- a/.ai/categories/polkadot-protocol.md +++ b/.ai/categories/polkadot-protocol.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -2105,53 +2055,13 @@ To build the smart contract, follow the steps below: 6. 
Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -7093,16 +7003,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7126,24 +7027,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. 
- pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -7986,8 +7870,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -8809,69 +8692,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! { - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. 
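Before chains are wired together in an emulated network, individual pallets are usually exercised first against a single mock runtime. The following is a minimal sketch of such a mock, assuming a pallet crate named `pallet_template`; the exact `Config` associated types your pallet requires will differ.

```rust
// Hedged sketch of a single-chain mock runtime for pallet unit tests,
// assuming a pallet crate named `pallet_template`.
use frame_support::{construct_runtime, derive_impl};
use sp_runtime::BuildStorage;

type Block = frame_system::mocking::MockBlock<Test>;

construct_runtime!(
    pub enum Test {
        System: frame_system,
        Template: pallet_template,
    }
);

// Use the test defaults for frame_system and only override the block type.
#[derive_impl(frame_system::config_preludes::TestDefaultConfig)]
impl frame_system::Config for Test {
    type Block = Block;
}

// The pallet's own configuration; adjust to the associated types it declares.
impl pallet_template::Config for Test {
    type RuntimeEvent = RuntimeEvent;
    type WeightInfo = ();
}

// Build empty genesis storage to run each test against.
pub fn new_test_ext() -> sp_io::TestExternalities {
    frame_system::GenesisConfig::<Test>::default()
        .build_storage()
        .unwrap()
        .into()
}
```

Tests then call `new_test_ext().execute_with(|| { ... })` to run assertions against this runtime.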
For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -10056,7 +9889,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -10333,7 +10166,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -10564,7 +10397,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/reference.md b/.ai/categories/reference.md index bb90f4a59..10698cf30 100644 --- a/.ai/categories/reference.md +++ b/.ai/categories/reference.md @@ -1610,7 +1610,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -1887,7 +1887,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -2118,7 +2118,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/smart-contracts.md b/.ai/categories/smart-contracts.md index 8b967c0fb..d3d25b143 100644 --- a/.ai/categories/smart-contracts.md +++ b/.ai/categories/smart-contracts.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... 
polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1834,53 +1784,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -4584,30 +4494,7 @@ To interact with the ECRecover precompile, you can deploy the `ECRecoverExample` The SHA-256 precompile computes the SHA-256 hash of the input data. ```solidity title="SHA256.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract SHA256Example { - event SHA256Called(bytes result); - - // Address of the SHA256 precompile - address constant SHA256_PRECOMPILE = address(0x02); - - bytes public result; - function callH256(bytes calldata input) public { - bool success; - bytes memory resultInMemory; - - (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input); - - if (success) { - emit SHA256Called(resultInMemory); - } - - result = resultInMemory; - } -} ``` To use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\_blank} module. @@ -4724,38 +4611,7 @@ To use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart The BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations. ```solidity title="BN128Add.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.20; - -contract BN128AddExample { - address constant BN128_ADD_PRECOMPILE = address(0x06); - - event BN128Added(uint256 x3, uint256 y3); - - uint256 public resultX; - uint256 public resultY; - - function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public { - bytes memory input = abi.encodePacked( - bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2) - ); - - bool success; - bytes memory output; - - (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input); - - require(success, "BN128Add precompile call failed"); - require(output.length == 64, "Invalid output length"); - - (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256)); - - resultX = x3; - resultY = y3; - emit BN128Added(x3, y3); - } -} ``` To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\_blank} demonstrates a valid curve addition and checks the result against known expected values. 
@@ -4765,42 +4621,7 @@ To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/sma The BN128Mul precompile performs scalar multiplication on the alt_bn128 curve. ```solidity title="BN128Mul.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract BN128MulExample { - // Precompile address for BN128Mul - address constant BN128_MUL_ADDRESS = address(0x07); - - bytes public result; - - // Performs scalar multiplication of a point on the alt_bn128 curve - function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public { - // Format: [x, y, scalar] - each 32 bytes - bytes memory input = abi.encodePacked( - bytes32(x1), - bytes32(y1), - bytes32(scalar) - ); - - (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{ - value: 0 - }(input); - require(success, "BN128Mul precompile call failed"); - - result = resultInMemory; - } - // Helper to decode result from `result` storage - function getResult() public view returns (uint256 x2, uint256 y2) { - bytes memory tempResult = result; - require(tempResult.length >= 64, "Invalid result length"); - assembly { - x2 := mload(add(tempResult, 32)) - y2 := mload(add(tempResult, 64)) - } - } -} ``` To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`. @@ -4810,38 +4631,7 @@ To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-envi The BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification. ```solidity title="BN128Pairing.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract BN128PairingExample { - // Precompile address for BN128Pairing - address constant BN128_PAIRING_ADDRESS = address(0x08); - - bytes public result; - - // Performs a pairing check on the alt_bn128 curve - function bn128Pairing(bytes memory input) public { - // Call the precompile - (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS - .call{value: 0}(input); - require(success, "BN128Pairing precompile call failed"); - result = resultInMemory; - } - - // Helper function to decode the result from `result` storage - function getResult() public view returns (bool isValid) { - bytes memory tempResult = result; - require(tempResult.length == 32, "Invalid result length"); - - uint256 output; - assembly { - output := mload(add(tempResult, 32)) - } - - isValid = (output == 1); - } -} ``` You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\_blank} contains these tests with working examples. @@ -4851,105 +4641,7 @@ You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-env The Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function. 
```solidity title="Blake2F.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract Blake2FExample { - // Precompile address for Blake2F - address constant BLAKE2F_ADDRESS = address(0x09); - - bytes public result; - - function blake2F(bytes memory input) public { - // Input must be exactly 213 bytes - require(input.length == 213, "Invalid input length - must be 213 bytes"); - - // Call the precompile - (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{ - value: 0 - }(input); - require(success, "Blake2F precompile call failed"); - - result = resultInMemory; - } - - // Helper function to decode the result from `result` storage - function getResult() public view returns (bytes32[8] memory output) { - bytes memory tempResult = result; - require(tempResult.length == 64, "Invalid result length"); - - for (uint i = 0; i < 8; i++) { - assembly { - mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i)))) - } - } - } - - - // Helper function to create Blake2F input from parameters - function createBlake2FInput( - uint32 rounds, - bytes32[8] memory h, - bytes32[16] memory m, - bytes8[2] memory t, - bool f - ) public pure returns (bytes memory) { - // Start with rounds (4 bytes, big-endian) - bytes memory input = abi.encodePacked(rounds); - - // Add state vector h (8 * 32 = 256 bytes) - for (uint i = 0; i < 8; i++) { - input = abi.encodePacked(input, h[i]); - } - - // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes) - // Blake2F expects 64-bit words in little-endian format - for (uint i = 0; i < 16; i++) { - // Take only the first 8 bytes of each bytes32 and reverse for little-endian - bytes8 word = bytes8(m[i]); - input = abi.encodePacked(input, word); - } - - // Add offset counters t (2 * 8 = 16 bytes) - input = abi.encodePacked(input, t[0], t[1]); - - // Add final block flag (1 byte) - input = abi.encodePacked(input, f ? bytes1(0x01) : bytes1(0x00)); - - return input; - } - // Simplified function that works with raw hex input - function blake2FFromHex(string memory hexInput) public { - bytes memory input = hexStringToBytes(hexInput); - blake2F(input); - } - - // Helper function to convert hex string to bytes - function hexStringToBytes(string memory hexString) public pure returns (bytes memory) { - bytes memory hexBytes = bytes(hexString); - require(hexBytes.length % 2 == 0, "Invalid hex string length"); - - bytes memory result = new bytes(hexBytes.length / 2); - - for (uint i = 0; i < hexBytes.length / 2; i++) { - result[i] = bytes1( - (hexCharToByte(hexBytes[2 * i]) << 4) | - hexCharToByte(hexBytes[2 * i + 1]) - ); - } - - return result; - } - - function hexCharToByte(bytes1 char) internal pure returns (uint8) { - uint8 c = uint8(char); - if (c >= 48 && c <= 57) return c - 48; // 0-9 - if (c >= 65 && c <= 70) return c - 55; // A-F - if (c >= 97 && c <= 102) return c - 87; // a-f - revert("Invalid hex character"); - } -} ``` To use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors. 
@@ -7995,16 +7687,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -8028,24 +7711,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. - pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -8756,8 +8422,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -9579,69 +9244,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! 
{ - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -13367,7 +12982,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -13644,7 +13259,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? 
interface "Input parameters" @@ -13875,7 +13490,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/categories/tooling.md b/.ai/categories/tooling.md index ea2a85a71..eef78fbc5 100644 --- a/.ai/categories/tooling.md +++ b/.ai/categories/tooling.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - [dependencies] + ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -288,63 +288,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] + ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - [workspace.package] - license = "MIT-0" - authors = ["Parity Technologies "] - homepage = "https://paritytech.github.io/polkadot-sdk/" - repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" - edition = "2021" - - [workspace] - default-members = ["pallets/template", "runtime"] - members = [ - "node", "pallets/custom-pallet", - "pallets/template", - "runtime", - ] - resolver = "2" - - [workspace.dependencies] - parachain-template-runtime = { path = "./runtime", default-features = false } - pallet-parachain-template = { path = "./pallets/template", default-features = false } - clap = { version = "4.5.13" } - color-print = { version = "0.3.4" } - docify = { version = "0.2.9" } - futures = { version = "0.3.31" } - jsonrpsee = { version = "0.24.3" } - log = { version = "0.4.22", default-features = false } - polkadot-sdk = { version = "2503.0.1", default-features = false } - prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } - serde = { version = "1.0.214", default-features = false } - codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } - cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } - hex-literal = { version = "0.4.1", default-features = false } - scale-info = { version = "2.11.6", default-features = false } - serde_json = { version = "1.0.132", default-features = false } - smallvec = { version = "1.11.0", default-features = false } - substrate-wasm-builder = { version = "26.0.1", default-features = false } - frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } - - [profile.release] - opt-level = 3 - panic = "unwind" - - [profile.production] - codegen-units = 1 - inherits = "release" - lto = true + ``` @@ -1710,31 +1660,7 @@ npm install ethers@6.13.5 To interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. 
Start by creating a new file called `utils/ethers.js` and add the following code: ```javascript title="app/utils/ethers.js" -import { JsonRpcProvider } from 'ethers'; -export const PASSET_HUB_CONFIG = { - name: 'Passet Hub', - rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC - chainId: 420420422, // Passet Hub testnet chainId - blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/', -}; - -export const getProvider = () => { - return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, { - chainId: PASSET_HUB_CONFIG.chainId, - name: PASSET_HUB_CONFIG.name, - }); -}; - -// Helper to get a signer from a provider -export const getSigner = async (provider) => { - if (window.ethereum) { - await window.ethereum.request({ method: 'eth_requestAccounts' }); - const ethersProvider = new ethers.BrowserProvider(window.ethereum); - return ethersProvider.getSigner(); - } - throw new Error('No Ethereum browser provider detected'); -}; ``` This file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state. @@ -1746,55 +1672,13 @@ For this dApp, you'll use a simple Storage contract already deployed. So, you ne ???+ code "Storage.sol ABI" ```json title="abis/Storage.json" - [ - { - "inputs": [ - { - "internalType": "uint256", - "name": "_newNumber", - "type": "uint256" - } - ], - "name": "setNumber", - "outputs": [], - "stateMutability": "nonpayable", - "type": "function" - }, - { - "inputs": [], - "name": "storedNumber", - "outputs": [ - { - "internalType": "uint256", - "name": "", - "type": "uint256" - } - ], - "stateMutability": "view", - "type": "function" - } - ] + ``` Now, create a file called `app/utils/contract.js`: ```javascript title="app/utils/contract.js" -import { Contract } from 'ethers'; -import { getProvider } from './ethers'; -import StorageABI from '../../abis/Storage.json'; - -export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; -export const CONTRACT_ABI = StorageABI; - -export const getContract = () => { - const provider = getProvider(); - return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider); -}; - -export const getSignedContract = async (signer) => { - return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer); -}; ``` This file defines the contract address, ABI, and functions to create instances of the contract for reading and writing. @@ -1804,167 +1688,7 @@ This file defines the contract address, ABI, and functions to create instances o Next, let's create a component to handle wallet connections. 
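(Optional) If you'd like to confirm the contract values above before building the UI components, a standalone read works without any wallet. This is a minimal sketch that mirrors the address and `storedNumber` getter defined in `app/utils/contract.js`; the script name and the inlined one-entry ABI are illustrative:

```javascript title="scripts/read-number.mjs (optional sanity check)"
import { Contract, JsonRpcProvider } from 'ethers';

// Same endpoint, address, and getter used in the app code above
const RPC_URL = 'https://testnet-passet-hub-eth-rpc.polkadot.io';
const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f';
const ABI = ['function storedNumber() view returns (uint256)'];

const provider = new JsonRpcProvider(RPC_URL);
const contract = new Contract(CONTRACT_ADDRESS, ABI, provider);

// Read the current value; no signer is needed for a view call
const stored = await contract.storedNumber();
console.log('storedNumber =', stored.toString());
```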
Create a new file called `app/components/WalletConnect.js`: ```javascript title="app/components/WalletConnect.js" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { PASSET_HUB_CONFIG } from '../utils/ethers'; - -const WalletConnect = ({ onConnect }) => { - const [account, setAccount] = useState(null); - const [chainId, setChainId] = useState(null); - const [error, setError] = useState(null); - - useEffect(() => { - // Check if user already has an authorized wallet connection - const checkConnection = async () => { - if (window.ethereum) { - try { - // eth_accounts doesn't trigger the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_accounts', - }); - if (accounts.length > 0) { - setAccount(accounts[0]); - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }); - setChainId(parseInt(chainIdHex, 16)); - } - } catch (err) { - console.error('Error checking connection:', err); - setError('Failed to check wallet connection'); - } - } - }; - - checkConnection(); - - if (window.ethereum) { - // Setup wallet event listeners - window.ethereum.on('accountsChanged', (accounts) => { - setAccount(accounts[0] || null); - if (accounts[0] && onConnect) onConnect(accounts[0]); - }); - - window.ethereum.on('chainChanged', (chainIdHex) => { - setChainId(parseInt(chainIdHex, 16)); - }); - } - - return () => { - // Cleanup event listeners - if (window.ethereum) { - window.ethereum.removeListener('accountsChanged', () => {}); - window.ethereum.removeListener('chainChanged', () => {}); - } - }; - }, [onConnect]); - - const connectWallet = async () => { - if (!window.ethereum) { - setError( - 'MetaMask not detected! Please install MetaMask to use this dApp.' - ); - return; - } - - try { - // eth_requestAccounts triggers the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_requestAccounts', - }); - setAccount(accounts[0]); - - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }); - const currentChainId = parseInt(chainIdHex, 16); - setChainId(currentChainId); - - // Prompt user to switch networks if needed - if (currentChainId !== PASSET_HUB_CONFIG.chainId) { - await switchNetwork(); - } - - if (onConnect) onConnect(accounts[0]); - } catch (err) { - console.error('Error connecting to wallet:', err); - setError('Failed to connect wallet'); - } - }; - - const switchNetwork = async () => { - try { - await window.ethereum.request({ - method: 'wallet_switchEthereumChain', - params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }], - }); - } catch (switchError) { - // Error 4902 means the chain hasn't been added to MetaMask - if (switchError.code === 4902) { - try { - await window.ethereum.request({ - method: 'wallet_addEthereumChain', - params: [ - { - chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`, - chainName: PASSET_HUB_CONFIG.name, - rpcUrls: [PASSET_HUB_CONFIG.rpc], - blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer], - }, - ], - }); - } catch (addError) { - setError('Failed to add network to wallet'); - } - } else { - setError('Failed to switch network'); - } - } - }; - - // UI-only disconnection - MetaMask doesn't support programmatic disconnection - const disconnectWallet = () => { - setAccount(null); - }; - - return ( -
-      {/* error banner: shows {error} when set */}
-      {/* not connected: "Connect Wallet" button calling connectWallet() */}
-      {/* connected: truncated address `${account.substring(0, 6)}...${account.substring(38)}`, */}
-      {/* a "Switch Network" button when chainId !== PASSET_HUB_CONFIG.chainId, and a disconnect control (disconnectWallet) */}
- ); -}; -export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. @@ -1973,25 +1697,9 @@ To integrate this component to your dApp, you need to overwrite the existing boi ```javascript title="app/page.js" -import { useState } from 'react'; -import WalletConnect from './components/WalletConnect'; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - return ( -
-      {/* heading: "Ethers.js dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} /> */}
- ); -} ``` In your terminal, you can launch your project by running: @@ -2009,64 +1717,7 @@ And you will see the following: Now, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`: ```javascript title="app/components/ReadContract.js" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { getContract } from '../utils/contract'; -const ReadContract = () => { - const [storedNumber, setStoredNumber] = useState(null); - const [loading, setLoading] = useState(true); - const [error, setError] = useState(null); - - useEffect(() => { - // Function to read data from the blockchain - const fetchData = async () => { - try { - setLoading(true); - const contract = getContract(); - // Call the smart contract's storedNumber function - const number = await contract.storedNumber(); - setStoredNumber(number.toString()); - setError(null); - } catch (err) { - console.error('Error fetching stored number:', err); - setError('Failed to fetch data from the contract'); - } finally { - setLoading(false); - } - }; - - fetchData(); - - // Poll for updates every 10 seconds to keep UI in sync with blockchain - const interval = setInterval(fetchData, 10000); - - // Clean up interval on component unmount - return () => clearInterval(interval); - }, []); - - return ( -
-      {/* heading: "Contract Data" */}
-      {/* loading state: spinner placeholder */}
-      {/* error state: shows {error} */}
-      {/* otherwise: "Stored Number: {storedNumber}" */}
- ); -}; - -export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically. @@ -2075,27 +1726,9 @@ To see this change in your dApp, you need to integrate this component into the ` ```javascript title="app/page.js" -import { useState } from 'react'; -import WalletConnect from './components/WalletConnect'; -import ReadContract from './components/ReadContract'; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - return ( -
-      {/* heading: "Ethers.js dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} /> and <ReadContract /> */}
- ); -} ``` Your dApp will automatically be updated to the following: @@ -2107,119 +1740,7 @@ Your dApp will automatically be updated to the following: Finally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`: ```javascript title="app/components/WriteContract.js" -'use client'; - -import { useState } from 'react'; -import { getSignedContract } from '../utils/contract'; -import { ethers } from 'ethers'; - -const WriteContract = ({ account }) => { - const [newNumber, setNewNumber] = useState(''); - const [status, setStatus] = useState({ type: null, message: '' }); - const [isSubmitting, setIsSubmitting] = useState(false); - - const handleSubmit = async (e) => { - e.preventDefault(); - // Validation checks - if (!account) { - setStatus({ type: 'error', message: 'Please connect your wallet first' }); - return; - } - - if (!newNumber || isNaN(Number(newNumber))) { - setStatus({ type: 'error', message: 'Please enter a valid number' }); - return; - } - - try { - setIsSubmitting(true); - setStatus({ type: 'info', message: 'Initiating transaction...' }); - - // Get a signer from the connected wallet - const provider = new ethers.BrowserProvider(window.ethereum); - const signer = await provider.getSigner(); - const contract = await getSignedContract(signer); - - // Send transaction to blockchain and wait for user confirmation in wallet - setStatus({ - type: 'info', - message: 'Please confirm the transaction in your wallet...', - }); - - // Call the contract's setNumber function - const tx = await contract.setNumber(newNumber); - - // Wait for transaction to be mined - setStatus({ - type: 'info', - message: 'Transaction submitted. Waiting for confirmation...', - }); - const receipt = await tx.wait(); - - setStatus({ - type: 'success', - message: `Transaction confirmed! Transaction hash: ${receipt.hash}`, - }); - setNewNumber(''); - } catch (err) { - console.error('Error updating number:', err); - - // Error code 4001 is MetaMask's code for user rejection - if (err.code === 4001) { - setStatus({ type: 'error', message: 'Transaction rejected by user.' }); - } else { - setStatus({ - type: 'error', - message: `Error: ${err.message || 'Failed to send transaction'}`, - }); - } - } finally { - setIsSubmitting(false); - } - }; - - return ( -
-      {/* heading: "Update Stored Number" */}
-      {/* status banner: shows {status.message} when present */}
-      {/* number input: onChange={(e) => setNewNumber(e.target.value)}, disabled={isSubmitting || !account} */}
-      {/* submit button, plus the hint "Connect your wallet to update the stored number." when no account is connected */}
- ); -}; - -export default WriteContract; ``` This component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed. @@ -2227,32 +1748,7 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.js` file to integrate all components: ```javascript title="app/page.js" -'use client'; - -import { useState } from 'react'; - -import WalletConnect from './components/WalletConnect'; -import ReadContract from './components/ReadContract'; -import WriteContract from './components/WriteContract'; - -export default function Home() { - const [account, setAccount] = useState(null); - - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - return ( -
-      {/* heading: "Ethers.js dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} />, <ReadContract />, and <WriteContract account={account} /> */}
- ); -} ``` The completed UI will display: @@ -2357,47 +1853,7 @@ npm install --save-dev typescript @types/node To interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code: ```typescript title="viem.ts" -import { createPublicClient, http, createWalletClient, custom } from 'viem' -import 'viem/window'; - -const transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io') - -// Configure the Passet Hub chain -export const passetHub = { - id: 420420422, - name: 'Passet Hub', - network: 'passet-hub', - nativeCurrency: { - decimals: 18, - name: 'PAS', - symbol: 'PAS', - }, - rpcUrls: { - default: { - http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], - }, - }, -} as const - -// Create a public client for reading data -export const publicClient = createPublicClient({ - chain: passetHub, - transport -}) - -// Create a wallet client for signing transactions -export const getWalletClient = async () => { - if (typeof window !== 'undefined' && window.ethereum) { - const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' }); - return createWalletClient({ - chain: passetHub, - transport: custom(window.ethereum), - account, - }); - } - throw new Error('No Ethereum browser provider detected'); -}; ``` This file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\_blank} reference for more information. @@ -2443,31 +1899,7 @@ Create a folder called `abis` at the root of your project, then create a file na Next, create a file called `utils/contract.ts`: ```typescript title="contract.ts" -import { getContract } from 'viem'; -import { publicClient, getWalletClient } from './viem'; -import StorageABI from '../../abis/Storage.json'; - -export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; -export const CONTRACT_ABI = StorageABI; -// Create a function to get a contract instance for reading -export const getContractInstance = () => { - return getContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - client: publicClient, - }); -}; - -// Create a function to get a contract instance with a signer for writing -export const getSignedContract = async () => { - const walletClient = await getWalletClient(); - return getContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - client: walletClient, - }); -}; ``` This file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts. @@ -2477,180 +1909,7 @@ This file defines the contract address, ABI, and functions to create a viem [con Now, let's create a component to handle wallet connections. 
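(Optional) To double-check the viem client and contract values before building the UI, you can read `storedNumber` from a one-off script. A minimal sketch follows; the script name is illustrative, and it inlines the same endpoint, address, and getter used above:

```typescript title="scripts/check-stored-number.ts (optional sanity check)"
import { createPublicClient, http, parseAbi } from 'viem';

// Same Passet Hub TestNet endpoint used in utils/viem.ts
const client = createPublicClient({
  transport: http('https://testnet-passet-hub-eth-rpc.polkadot.io'),
});

// Read the stored value; no wallet is required for a view call
const stored = await client.readContract({
  address: '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f',
  abi: parseAbi(['function storedNumber() view returns (uint256)']),
  functionName: 'storedNumber',
});

console.log('storedNumber =', stored.toString());
```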
Create a new file called `components/WalletConnect.tsx`: ```typescript title="WalletConnect.tsx" -"use client"; - -import React, { useState, useEffect } from "react"; -import { passetHub } from "../utils/viem"; - -interface WalletConnectProps { - onConnect: (account: string) => void; -} - -const WalletConnect: React.FC = ({ onConnect }) => { - const [account, setAccount] = useState(null); - const [chainId, setChainId] = useState(null); - const [error, setError] = useState(null); - - useEffect(() => { - // Check if user already has an authorized wallet connection - const checkConnection = async () => { - if (typeof window !== 'undefined' && window.ethereum) { - try { - // eth_accounts doesn't trigger the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_accounts', - }) as string[]; - - if (accounts.length > 0) { - setAccount(accounts[0]); - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }) as string; - setChainId(parseInt(chainIdHex, 16)); - onConnect(accounts[0]); - } - } catch (err) { - console.error('Error checking connection:', err); - setError('Failed to check wallet connection'); - } - } - }; - - checkConnection(); - - if (typeof window !== 'undefined' && window.ethereum) { - // Setup wallet event listeners - window.ethereum.on('accountsChanged', (accounts: string[]) => { - setAccount(accounts[0] || null); - if (accounts[0]) onConnect(accounts[0]); - }); - - window.ethereum.on('chainChanged', (chainIdHex: string) => { - setChainId(parseInt(chainIdHex, 16)); - }); - } - - return () => { - // Cleanup event listeners - if (typeof window !== 'undefined' && window.ethereum) { - window.ethereum.removeListener('accountsChanged', () => {}); - window.ethereum.removeListener('chainChanged', () => {}); - } - }; - }, [onConnect]); - - const connectWallet = async () => { - if (typeof window === 'undefined' || !window.ethereum) { - setError( - 'MetaMask not detected! Please install MetaMask to use this dApp.' 
- ); - return; - } - - try { - // eth_requestAccounts triggers the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_requestAccounts', - }) as string[]; - - setAccount(accounts[0]); - - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }) as string; - - const currentChainId = parseInt(chainIdHex, 16); - setChainId(currentChainId); - - // Prompt user to switch networks if needed - if (currentChainId !== passetHub.id) { - await switchNetwork(); - } - - onConnect(accounts[0]); - } catch (err) { - console.error('Error connecting to wallet:', err); - setError('Failed to connect wallet'); - } - }; - const switchNetwork = async () => { - console.log('Switch network') - try { - await window.ethereum.request({ - method: 'wallet_switchEthereumChain', - params: [{ chainId: `0x${passetHub.id.toString(16)}` }], - }); - } catch (switchError: any) { - // Error 4902 means the chain hasn't been added to MetaMask - if (switchError.code === 4902) { - try { - await window.ethereum.request({ - method: 'wallet_addEthereumChain', - params: [ - { - chainId: `0x${passetHub.id.toString(16)}`, - chainName: passetHub.name, - rpcUrls: [passetHub.rpcUrls.default.http[0]], - nativeCurrency: { - name: passetHub.nativeCurrency.name, - symbol: passetHub.nativeCurrency.symbol, - decimals: passetHub.nativeCurrency.decimals, - }, - }, - ], - }); - } catch (addError) { - setError('Failed to add network to wallet'); - } - } else { - setError('Failed to switch network'); - } - } - }; - - // UI-only disconnection - MetaMask doesn't support programmatic disconnection - const disconnectWallet = () => { - setAccount(null); - }; - - return ( -
-      {/* error banner: shows {error} when set */}
-      {/* not connected: "Connect Wallet" button calling connectWallet() */}
-      {/* connected: truncated address `${account.substring(0, 6)}...${account.substring(38)}`, */}
-      {/* a "Switch Network" button when chainId !== passetHub.id, and a disconnect control (disconnectWallet) */}
- ); -}; - -export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected. @@ -2659,24 +1918,9 @@ To use this component in your dApp, replace the existing boilerplate in `app/pag ```typescript title="page.tsx" -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - return ( -
-      {/* heading: "Viem dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} /> */}
- ); -} + ``` Now you're ready to run your dApp. From your project directory, execute: @@ -2694,70 +1938,7 @@ Navigate to `http://localhost:3000` in your browser, and you should see your dAp Now, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`: ```typescript title="ReadContract.tsx" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { publicClient } from '../utils/viem'; -import { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract'; - -const ReadContract: React.FC = () => { - const [storedNumber, setStoredNumber] = useState(null); - const [loading, setLoading] = useState(true); - const [error, setError] = useState(null); - - useEffect(() => { - // Function to read data from the blockchain - const fetchData = async () => { - try { - setLoading(true); - // Call the smart contract's storedNumber function - const number = await publicClient.readContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - functionName: 'storedNumber', - args: [], - }) as bigint; - - setStoredNumber(number.toString()); - setError(null); - } catch (err) { - console.error('Error fetching stored number:', err); - setError('Failed to fetch data from the contract'); - } finally { - setLoading(false); - } - }; - - fetchData(); - // Poll for updates every 10 seconds to keep UI in sync with blockchain - const interval = setInterval(fetchData, 10000); - - // Clean up interval on component unmount - return () => clearInterval(interval); - }, []); - - return ( -
-      {/* heading: "Contract Data" */}
-      {/* loading state: spinner placeholder */}
-      {/* error state: shows {error} */}
-      {/* otherwise: "Stored Number: {storedNumber}" */}
- ); -}; - -export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state. @@ -2766,26 +1947,9 @@ To reflect this change in your dApp, incorporate this component into the `app/pa ```typescript title="page.tsx" -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -import ReadContract from "./components/ReadContract"; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - return ( -
-      {/* heading: "Viem dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} /> and <ReadContract /> */}
- ); -} + ``` And you will see in your browser: @@ -3021,31 +2185,7 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.tsx` file to integrate all components: ```typescript title="page.tsx" -"use client"; - -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -import ReadContract from "./components/ReadContract"; -import WriteContract from "./components/WriteContract"; - -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - - return ( -
-      {/* heading: "Viem dApp - Passet Hub Smart Contracts" */}
-      {/* renders <WalletConnect onConnect={handleConnect} />, <ReadContract />, and <WriteContract /> wired to the connected account */}
- ); -} ``` After that, you will see: @@ -3203,53 +2343,13 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code @@ -15909,16 +15009,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - /// The full account information for a particular account ID. - #[pallet::storage] - #[pallet::getter(fn account)] - pub type Account = StorageMap< - _, - Blake2_128Concat, - T::AccountId, - AccountInfo, - ValueQuery, - >; + ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -15942,24 +15033,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs -/// Information of an account. -#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] -pub struct AccountInfo { - /// The number of transactions this account has sent. - pub nonce: Nonce, - /// The number of other modules that currently depend on this account's existence. The account - /// cannot be reaped until this is zero. - pub consumers: RefCount, - /// The number of other modules that allow this account to exist. The account may not be reaped - /// until this and `sufficients` are both zero. - pub providers: RefCount, - /// The number of modules that allow this account to exist for their own purposes only. The - /// account may not be reaped until this and `providers` are both zero. - pub sufficients: RefCount, - /// The additional data that belongs to this account. Used to store the balance(s) in a lot of - /// chains. 
- pub data: AccountData, -} + ``` The `AccountInfo` structure includes the following components: @@ -17304,8 +16378,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -18480,42 +17553,7 @@ Let's start by setting up Hardhat for your Storage contract project: 6. Configure Hardhat by updating the `hardhat.config.js` file: ```javascript title="hardhat.config.js" - require("@nomicfoundation/hardhat-toolbox"); - - require("@parity/hardhat-polkadot"); - - const { vars } = require("hardhat/config"); - - /** @type import('hardhat/config').HardhatUserConfig */ - module.exports = { - solidity: "0.8.28", - resolc: { - compilerSource: "npm", - }, - networks: { - hardhat: { - polkavm: true, - nodeConfig: { - nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE', - rpcPort: 8000, - dev: true, - }, - adapterConfig: { - adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER', - dev: true, - }, - }, - localNode: { - polkavm: true, - url: `http://127.0.0.1:8545`, - }, - passetHub: { - polkavm: true, - url: 'https://testnet-passet-hub-eth-rpc.polkadot.io', - accounts: [vars.get("PRIVATE_KEY")], - }, - }, - }; + ``` Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. @@ -18769,13 +17807,7 @@ Testing is a critical part of smart contract development. Hardhat makes it easy 1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic: ```javascript title="StorageModule.js" - const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules'); - - module.exports = buildModule('StorageModule', (m) => { - const storage = m.contract('Storage'); - - return { storage }; - }); + ``` 2. Deploy to the local network: @@ -18993,69 +18025,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! 
{ - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! { - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -23208,7 +22190,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -23485,7 +22467,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? 
interface "Input parameters" @@ -23716,7 +22698,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? interface "Input parameters" diff --git a/.ai/pages/develop-interoperability-send-messages.md b/.ai/pages/develop-interoperability-send-messages.md index 57fbb8be0..ac760f8db 100644 --- a/.ai/pages/develop-interoperability-send-messages.md +++ b/.ai/pages/develop-interoperability-send-messages.md @@ -84,8 +84,7 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust -pub type PriceForChildParachainDelivery = - ExponentialPrice; + ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. diff --git a/.ai/pages/develop-interoperability-test-and-debug.md b/.ai/pages/develop-interoperability-test-and-debug.md index 311ca2f26..8d898ddde 100644 --- a/.ai/pages/develop-interoperability-test-and-debug.md +++ b/.ai/pages/develop-interoperability-test-and-debug.md @@ -77,69 +77,19 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - decl_test_parachains! { - pub struct AssetHubWestend { - genesis = genesis::genesis(), - on_init = { - asset_hub_westend_runtime::AuraExt::on_initialize(1); - }, - runtime = asset_hub_westend_runtime, - core = { - XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, - LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, - ParachainInfo: asset_hub_westend_runtime::ParachainInfo, - MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, - DigestProvider: (), - }, - pallets = { - PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, - Balances: asset_hub_westend_runtime::Balances, - Assets: asset_hub_westend_runtime::Assets, - ForeignAssets: asset_hub_westend_runtime::ForeignAssets, - PoolAssets: asset_hub_westend_runtime::PoolAssets, - AssetConversion: asset_hub_westend_runtime::AssetConversion, - SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, - Revive: asset_hub_westend_runtime::Revive, - } - }, - } + ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - decl_test_bridges! 
{ - pub struct RococoWestendMockBridge { - source = BridgeHubRococoPara, - target = BridgeHubWestendPara, - handler = RococoWestendMessageHandler - }, - pub struct WestendRococoMockBridge { - source = BridgeHubWestendPara, - target = BridgeHubRococoPara, - handler = WestendRococoMessageHandler - } - } + ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - decl_test_networks! { - pub struct WestendMockNet { - relay_chain = Westend, - parachains = vec![ - AssetHubWestend, - BridgeHubWestend, - CollectivesWestend, - CoretimeWestend, - PeopleWestend, - PenpalA, - PenpalB, - ], - bridge = () - }, - } + ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. diff --git a/.ai/pages/develop-interoperability-xcm-guides.md b/.ai/pages/develop-interoperability-xcm-guides.md index 0b3833a56..a14d646ae 100644 --- a/.ai/pages/develop-interoperability-xcm-guides.md +++ b/.ai/pages/develop-interoperability-xcm-guides.md @@ -20,28 +20,24 @@ Whether you're building applications that need to interact with multiple chains, diff --git a/.ai/pages/develop-interoperability-xcm-runtime-apis.md b/.ai/pages/develop-interoperability-xcm-runtime-apis.md index 0b676f353..6d761bebd 100644 --- a/.ai/pages/develop-interoperability-xcm-runtime-apis.md +++ b/.ai/pages/develop-interoperability-xcm-runtime-apis.md @@ -38,7 +38,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust -fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -315,7 +315,7 @@ fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersio This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust -fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; + ``` ??? interface "Input parameters" @@ -546,7 +546,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust -fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; + ``` ??? 
interface "Input parameters" diff --git a/.ai/pages/develop-interoperability.md b/.ai/pages/develop-interoperability.md index 4325912a3..49b1616b5 100644 --- a/.ai/pages/develop-interoperability.md +++ b/.ai/pages/develop-interoperability.md @@ -21,28 +21,24 @@ This section covers everything you need to know about building and implementing diff --git a/.ai/pages/develop-parachains-customize-parachain.md b/.ai/pages/develop-parachains-customize-parachain.md index 86fdf0835..0c37b49c4 100644 --- a/.ai/pages/develop-parachains-customize-parachain.md +++ b/.ai/pages/develop-parachains-customize-parachain.md @@ -20,21 +20,18 @@ The [FRAME directory](https://github.com/paritytech/polkadot-sdk/tree/polkadot-s diff --git a/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md b/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md index b95195847..1f9b43dbb 100644 --- a/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md +++ b/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md @@ -107,29 +107,7 @@ To add a GitHub workflow for building the runtime: {% raw %} ```yml - name: Srtool build - - on: push - - jobs: - srtool: - runs-on: ubuntu-latest - strategy: - matrix: - chain: ["asset-hub-kusama", "asset-hub-westend"] - steps: - - uses: actions/checkout@v3 - - name: Srtool build - id: srtool_build - uses: chevdor/srtool-actions@v0.8.0 - with: - chain: ${{ matrix.chain }} - runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime - - name: Summary - run: | - echo '${{ steps.srtool_build.outputs.json }}' | jq . > ${{ matrix.chain }}-srtool-digest.json - cat ${{ matrix.chain }}-srtool-digest.json - echo "Runtime location: ${{ steps.srtool_build.outputs.wasm }}" + ``` {% endraw %} diff --git a/.ai/pages/develop-parachains-deployment.md b/.ai/pages/develop-parachains-deployment.md index bd9dae5d0..6cb349454 100644 --- a/.ai/pages/develop-parachains-deployment.md +++ b/.ai/pages/develop-parachains-deployment.md @@ -71,7 +71,6 @@ flowchart TD diff --git a/.ai/pages/develop-parachains-maintenance-storage-migrations.md b/.ai/pages/develop-parachains-maintenance-storage-migrations.md index 60d3fe6a8..ded3b70e9 100644 --- a/.ai/pages/develop-parachains-maintenance-storage-migrations.md +++ b/.ai/pages/develop-parachains-maintenance-storage-migrations.md @@ -170,111 +170,7 @@ Examine the following migration example that transforms a simple `StorageValue` - Migration: ```rust - use frame_support::{ - storage_alias, - traits::{Get, UncheckedOnRuntimeUpgrade}, - }; - - #[cfg(feature = "try-runtime")] - use alloc::vec::Vec; - - /// Collection of storage item formats from the previous storage version. - /// - /// Required so we can read values in the v0 storage format during the migration. - mod v0 { - use super::*; - - /// V0 type for [`crate::Value`]. - #[storage_alias] - pub type Value = StorageValue, u32>; - } - - /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1. - /// - /// In V0 of the template [`crate::Value`] is just a `u32`. In V1, it has been upgraded to - /// contain the struct [`crate::CurrentAndPreviousValue`]. - /// - /// In this migration, update the on-chain storage for the pallet to reflect the new storage - /// layout. - pub struct InnerMigrateV0ToV1(core::marker::PhantomData); - - impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 { - /// Return the existing [`crate::Value`] so we can check that it was correctly set in - /// `InnerMigrateV0ToV1::post_upgrade`. 
- #[cfg(feature = "try-runtime")] - fn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> { - use codec::Encode; - - // Access the old value using the `storage_alias` type - let old_value = v0::Value::::get(); - // Return it as an encoded `Vec` - Ok(old_value.encode()) - } - - /// Migrate the storage from V0 to V1. - /// - /// - If the value doesn't exist, there is nothing to do. - /// - If the value exists, it is read and then written back to storage inside a - /// [`crate::CurrentAndPreviousValue`]. - fn on_runtime_upgrade() -> frame_support::weights::Weight { - // Read the old value from storage - if let Some(old_value) = v0::Value::::take() { - // Write the new value to storage - let new = crate::CurrentAndPreviousValue { current: old_value, previous: None }; - crate::Value::::put(new); - // One read + write for taking the old value, and one write for setting the new value - T::DbWeight::get().reads_writes(1, 2) - } else { - // No writes since there was no old value, just one read for checking - T::DbWeight::get().reads(1) - } - } - - /// Verifies the storage was migrated correctly. - /// - /// - If there was no old value, the new value should not be set. - /// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`]. - #[cfg(feature = "try-runtime")] - fn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> { - use codec::Decode; - use frame_support::ensure; - - let maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| { - sp_runtime::TryRuntimeError::Other("Failed to decode old value from storage") - })?; - - match maybe_old_value { - Some(old_value) => { - let expected_new_value = - crate::CurrentAndPreviousValue { current: old_value, previous: None }; - let actual_new_value = crate::Value::::get(); - - ensure!(actual_new_value.is_some(), "New value not set"); - ensure!( - actual_new_value == Some(expected_new_value), - "New value not set correctly" - ); - }, - None => { - ensure!(crate::Value::::get().is_none(), "New value unexpectedly set"); - }, - }; - Ok(()) - } - } - - /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a - /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that: - /// - The migration only runs once when the on-chain storage version is 0 - /// - The on-chain storage version is updated to `1` after the migration executes - /// - Reads/Writes from checking/settings the on-chain storage version are accounted for - pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration< - 0, // The migration will only execute when the on-chain storage version is 0 - 1, // The on-chain storage version will be set to 1 after the migration is complete - InnerMigrateV0ToV1, - crate::pallet::Pallet, - ::DbWeight, - >; + ``` ### Migration Organization diff --git a/.ai/pages/develop-parachains-maintenance.md b/.ai/pages/develop-parachains-maintenance.md index 1cc2392b7..4949e2b06 100644 --- a/.ai/pages/develop-parachains-maintenance.md +++ b/.ai/pages/develop-parachains-maintenance.md @@ -18,14 +18,12 @@ Learn how to maintain Polkadot SDK-based networks, focusing on runtime monitorin diff --git a/.ai/pages/develop-parachains-testing-benchmarking.md b/.ai/pages/develop-parachains-testing-benchmarking.md index 0cff99c05..b56d8b6b3 100644 --- a/.ai/pages/develop-parachains-testing-benchmarking.md +++ b/.ai/pages/develop-parachains-testing-benchmarking.md @@ -104,40 +104,7 @@ my-pallet/ With the directory structure set, you can use the 
[`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\_blank} to get started as follows: ```rust title="benchmarking.rs (starter template)" -//! Benchmarking setup for pallet-template -#![cfg(feature = "runtime-benchmarks")] - -use super::*; -use frame_benchmarking::v2::*; - -#[benchmarks] -mod benchmarks { - use super::*; - #[cfg(test)] - use crate::pallet::Pallet as Template; - use frame_system::RawOrigin; - - #[benchmark] - fn do_something() { - let caller: T::AccountId = whitelisted_caller(); - #[extrinsic_call] - do_something(RawOrigin::Signed(caller), 100); - - assert_eq!(Something::::get().map(|v| v.block_number), Some(100u32.into())); - } - - #[benchmark] - fn cause_error() { - Something::::put(CompositeStruct { block_number: 100u32.into() }); - let caller: T::AccountId = whitelisted_caller(); - #[extrinsic_call] - cause_error(RawOrigin::Signed(caller)); - - assert_eq!(Something::::get().map(|v| v.block_number), Some(101u32.into())); - } - - impl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test); -} + ``` In your benchmarking tests, employ these best practices: diff --git a/.ai/pages/develop-parachains-testing.md b/.ai/pages/develop-parachains-testing.md index 45fc04202..350d47b02 100644 --- a/.ai/pages/develop-parachains-testing.md +++ b/.ai/pages/develop-parachains-testing.md @@ -25,14 +25,12 @@ Through these guides, you'll learn to: diff --git a/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md b/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md index 2c14630c4..97dd3b443 100644 --- a/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md +++ b/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md @@ -81,30 +81,7 @@ To interact with the ECRecover precompile, you can deploy the `ECRecoverExample` The SHA-256 precompile computes the SHA-256 hash of the input data. ```solidity title="SHA256.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract SHA256Example { - event SHA256Called(bytes result); - - // Address of the SHA256 precompile - address constant SHA256_PRECOMPILE = address(0x02); - - bytes public result; - - function callH256(bytes calldata input) public { - bool success; - bytes memory resultInMemory; - - (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input); - - if (success) { - emit SHA256Called(resultInMemory); - } - result = resultInMemory; - } -} ``` To use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\_blank} module. @@ -221,38 +198,7 @@ To use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart The BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations. 
```solidity title="BN128Add.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.20; - -contract BN128AddExample { - address constant BN128_ADD_PRECOMPILE = address(0x06); - event BN128Added(uint256 x3, uint256 y3); - - uint256 public resultX; - uint256 public resultY; - - function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public { - bytes memory input = abi.encodePacked( - bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2) - ); - - bool success; - bytes memory output; - - (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input); - - require(success, "BN128Add precompile call failed"); - require(output.length == 64, "Invalid output length"); - - (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256)); - - resultX = x3; - resultY = y3; - - emit BN128Added(x3, y3); - } -} ``` To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\_blank} demonstrates a valid curve addition and checks the result against known expected values. @@ -262,42 +208,7 @@ To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/sma The BN128Mul precompile performs scalar multiplication on the alt_bn128 curve. ```solidity title="BN128Mul.sol" -// SPDX-License-Identifier: MIT -pragma solidity ^0.8.0; - -contract BN128MulExample { - // Precompile address for BN128Mul - address constant BN128_MUL_ADDRESS = address(0x07); - - bytes public result; - // Performs scalar multiplication of a point on the alt_bn128 curve - function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public { - // Format: [x, y, scalar] - each 32 bytes - bytes memory input = abi.encodePacked( - bytes32(x1), - bytes32(y1), - bytes32(scalar) - ); - - (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{ - value: 0 - }(input); - require(success, "BN128Mul precompile call failed"); - - result = resultInMemory; - } - - // Helper to decode result from `result` storage - function getResult() public view returns (uint256 x2, uint256 y2) { - bytes memory tempResult = result; - require(tempResult.length >= 64, "Invalid result length"); - assembly { - x2 := mload(add(tempResult, 32)) - y2 := mload(add(tempResult, 64)) - } - } -} ``` To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`. @@ -307,38 +218,7 @@ To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-envi The BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification. 
```solidity title="BN128Pairing.sol"
-// SPDX-License-Identifier: MIT
-pragma solidity ^0.8.0;
-
-contract BN128PairingExample {
-    // Precompile address for BN128Pairing
-    address constant BN128_PAIRING_ADDRESS = address(0x08);
-
-    bytes public result;
-
-    // Performs a pairing check on the alt_bn128 curve
-    function bn128Pairing(bytes memory input) public {
-        // Call the precompile
-        (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS
-            .call{value: 0}(input);
-        require(success, "BN128Pairing precompile call failed");
-        result = resultInMemory;
-    }
-
-    // Helper function to decode the result from `result` storage
-    function getResult() public view returns (bool isValid) {
-        bytes memory tempResult = result;
-        require(tempResult.length == 32, "Invalid result length");
-
-        uint256 output;
-        assembly {
-            output := mload(add(tempResult, 32))
-        }
-
-        isValid = (output == 1);
-    }
-}
```

You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\_blank}, which contains working examples of these tests.

@@ -348,105 +228,7 @@ You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-env

The Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.

```solidity title="Blake2F.sol"
-// SPDX-License-Identifier: MIT
-pragma solidity ^0.8.0;
-
-contract Blake2FExample {
-    // Precompile address for Blake2F
-    address constant BLAKE2F_ADDRESS = address(0x09);
-
-    bytes public result;
-
-    function blake2F(bytes memory input) public {
-        // Input must be exactly 213 bytes
-        require(input.length == 213, "Invalid input length - must be 213 bytes");
-
-        // Call the precompile
-        (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{
-            value: 0
-        }(input);
-        require(success, "Blake2F precompile call failed");
-
-        result = resultInMemory;
-    }
-
-    // Helper function to decode the result from `result` storage
-    function getResult() public view returns (bytes32[8] memory output) {
-        bytes memory tempResult = result;
-        require(tempResult.length == 64, "Invalid result length");
-
-        for (uint i = 0; i < 8; i++) {
-            assembly {
-                mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i))))
-            }
-        }
-    }
-
-    // Helper function to create Blake2F input from parameters
-    function createBlake2FInput(
-        uint32 rounds,
-        bytes32[8] memory h,
-        bytes32[16] memory m,
-        bytes8[2] memory t,
-        bool f
-    ) public pure returns (bytes memory) {
-        // Start with rounds (4 bytes, big-endian)
-        bytes memory input = abi.encodePacked(rounds);
-
-        // Add state vector h (8 * 32 = 256 bytes)
-        for (uint i = 0; i < 8; i++) {
-            input = abi.encodePacked(input, h[i]);
-        }
-
-        // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes)
-        // Blake2F expects 64-bit words in little-endian format
-        for (uint i = 0; i < 16; i++) {
-            // Take only the first 8 bytes of each bytes32 and reverse for little-endian
-            bytes8 word = bytes8(m[i]);
-            input = abi.encodePacked(input, word);
-        }
-
-        // Add offset counters t (2 * 8 = 16 bytes)
-        input = abi.encodePacked(input, t[0], t[1]);
-        // Add final block flag (1 byte)
-        input = abi.encodePacked(input, f ? bytes1(0x01) : bytes1(0x00));
-
-        return input;
-    }
-
-    // Simplified function that works with raw hex input
-    function blake2FFromHex(string memory hexInput) public {
-        bytes memory input = hexStringToBytes(hexInput);
-        blake2F(input);
-    }
-
-    // Helper function to convert hex string to bytes
-    function hexStringToBytes(string memory hexString) public pure returns (bytes memory) {
-        bytes memory hexBytes = bytes(hexString);
-        require(hexBytes.length % 2 == 0, "Invalid hex string length");
-
-        bytes memory result = new bytes(hexBytes.length / 2);
-
-        for (uint i = 0; i < hexBytes.length / 2; i++) {
-            result[i] = bytes1(
-                (hexCharToByte(hexBytes[2 * i]) << 4) |
-                    hexCharToByte(hexBytes[2 * i + 1])
-            );
-        }
-
-        return result;
-    }
-
-    function hexCharToByte(bytes1 char) internal pure returns (uint8) {
-        uint8 c = uint8(char);
-        if (c >= 48 && c <= 57) return c - 48; // 0-9
-        if (c >= 65 && c <= 70) return c - 55; // A-F
-        if (c >= 97 && c <= 102) return c - 87; // a-f
-        revert("Invalid hex character");
-    }
-}
```

To use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `blake2F` with the properly formatted input for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors.

diff --git a/.ai/pages/develop-toolkit-api-libraries.md b/.ai/pages/develop-toolkit-api-libraries.md
index 1e84ff250..2e9c39647 100644
--- a/.ai/pages/develop-toolkit-api-libraries.md
+++ b/.ai/pages/develop-toolkit-api-libraries.md
@@ -18,14 +18,12 @@ Explore the powerful API libraries designed for interacting with the Polkadot ne

diff --git a/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md b/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md
index d08a5a711..ac175c009 100644
--- a/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md
+++ b/.ai/pages/develop-toolkit-parachains-fork-chains-chopsticks.md
@@ -28,14 +28,12 @@ Whether you're debugging an issue, testing new features, or exploring cross-chai

diff --git a/.ai/pages/develop-toolkit-parachains-fork-chains.md b/.ai/pages/develop-toolkit-parachains-fork-chains.md
index 5e1e5d70f..c9ad1ec94 100644
--- a/.ai/pages/develop-toolkit-parachains-fork-chains.md
+++ b/.ai/pages/develop-toolkit-parachains-fork-chains.md
@@ -29,7 +29,6 @@ Forking a live chain creates a controlled environment that mirrors live network

diff --git a/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md b/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md
index ecd4913e2..4df40939d 100644
--- a/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md
+++ b/.ai/pages/develop-toolkit-parachains-spawn-chains-zombienet.md
@@ -27,7 +27,6 @@ Whether you're building a new parachain or testing runtime upgrades, Zombienet p

diff --git a/.ai/pages/develop-toolkit-parachains-spawn-chains.md b/.ai/pages/develop-toolkit-parachains-spawn-chains.md
index 3cb1a73d6..83f364a14 100644
--- a/.ai/pages/develop-toolkit-parachains-spawn-chains.md
+++ b/.ai/pages/develop-toolkit-parachains-spawn-chains.md
@@ -29,7 +29,6 @@ Spawning a network provides a controlled environment to test and validate variou

diff --git a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md
index 61652dd5d..a0bdcce8b 100644
--- a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md
+++ b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md
@@ -181,44 +181,7 @@ touch /etc/systemd/system/polkadot-validator.service

In this unit file, you will write the commands that you want to run on server boot/restart:

```systemd title="/etc/systemd/system/polkadot-validator.service"
-[Unit]
-Description=Polkadot Node
-After=network.target
-Documentation=https://github.com/paritytech/polkadot-sdk
-
-[Service]
-EnvironmentFile=-/etc/default/polkadot
-ExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS
-User=polkadot
-Group=polkadot
-Restart=always
-RestartSec=120
-CapabilityBoundingSet=
-LockPersonality=true
-NoNewPrivileges=true
-PrivateDevices=true
-PrivateMounts=true
-PrivateTmp=true
-PrivateUsers=true
-ProtectClock=true
-ProtectControlGroups=true
-ProtectHostname=true
-ProtectKernelModules=true
-ProtectKernelTunables=true
-ProtectSystem=strict
-RemoveIPC=true
-RestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX
-RestrictNamespaces=false
-RestrictSUIDSGID=true
-SystemCallArchitectures=native
-SystemCallFilter=@system-service
-SystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2
-SystemCallFilter=~@clock @module @reboot @swap @privileged
-SystemCallFilter=pivot_root
-UMask=0027
-
-[Install]
-WantedBy=multi-user.target
+
```

!!! warning "Restart delay and equivocation risk"

diff --git a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md
index e2dd7bc3f..8575896b8 100644
--- a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md
+++ b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding.md
@@ -20,28 +20,24 @@ This section provides guidance on how to set up, activate, and deactivate your v

diff --git a/.ai/pages/infrastructure-running-a-validator-operational-tasks.md b/.ai/pages/infrastructure-running-a-validator-operational-tasks.md
index 2f1648250..f54ea10d1 100644
--- a/.ai/pages/infrastructure-running-a-validator-operational-tasks.md
+++ b/.ai/pages/infrastructure-running-a-validator-operational-tasks.md
@@ -18,14 +18,12 @@ Running a Polkadot validator node involves several key operational tasks to ensu

diff --git a/.ai/pages/infrastructure-running-a-validator.md b/.ai/pages/infrastructure-running-a-validator.md
index d3a6cd49f..09f3d534b 100644
--- a/.ai/pages/infrastructure-running-a-validator.md
+++ b/.ai/pages/infrastructure-running-a-validator.md
@@ -20,21 +20,18 @@ Learn the requirements for setting up a Polkadot validator node, along with deta

diff --git a/.ai/pages/infrastructure-staking-mechanics.md b/.ai/pages/infrastructure-staking-mechanics.md
index 2ecbb1440..f8c35fc33 100644
--- a/.ai/pages/infrastructure-staking-mechanics.md
+++ b/.ai/pages/infrastructure-staking-mechanics.md
@@ -18,21 +18,18 @@ Gain a deep understanding of the staking mechanics in Polkadot, with a focus on

diff --git a/.ai/pages/polkadot-protocol-parachain-basics-accounts.md b/.ai/pages/polkadot-protocol-parachain-basics-accounts.md
index 80d2f595f..f5e3bc727 100644
--- a/.ai/pages/polkadot-protocol-parachain-basics-accounts.md
+++ b/.ai/pages/polkadot-protocol-parachain-basics-accounts.md
@@ -24,16 +24,7 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame

The code snippet below shows how accounts are defined:

```rs
-/// The full account information for a particular account ID.
-#[pallet::storage]
-#[pallet::getter(fn account)]
-pub type Account = StorageMap<
-    _,
-    Blake2_128Concat,
-    T::AccountId,
-    AccountInfo,
-    ValueQuery,
->;
+
```

The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`).

@@ -57,24 +48,7 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit

The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules.

```rs
-/// Information of an account.
-#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)]
-pub struct AccountInfo {
-    /// The number of transactions this account has sent.
-    pub nonce: Nonce,
-    /// The number of other modules that currently depend on this account's existence. The account
-    /// cannot be reaped until this is zero.
-    pub consumers: RefCount,
-    /// The number of other modules that allow this account to exist. The account may not be reaped
-    /// until this and `sufficients` are both zero.
-    pub providers: RefCount,
-    /// The number of modules that allow this account to exist for their own purposes only. The
-    /// account may not be reaped until this and `providers` are both zero.
-    pub sufficients: RefCount,
-    /// The additional data that belongs to this account. Used to store the balance(s) in a lot of
-    /// chains.
-    pub data: AccountData,
-}
+
```

The `AccountInfo` structure includes the following components:

diff --git a/.ai/pages/tutorials-dapps.md b/.ai/pages/tutorials-dapps.md
index 940a4746f..816de3147 100644
--- a/.ai/pages/tutorials-dapps.md
+++ b/.ai/pages/tutorials-dapps.md
@@ -20,14 +20,12 @@ You'll explore a range of topics—from client-side apps and CLI tools to on-cha

diff --git a/.ai/pages/tutorials-interoperability-xcm-channels.md b/.ai/pages/tutorials-interoperability-xcm-channels.md
index 3f1856040..d0e1948a3 100644
--- a/.ai/pages/tutorials-interoperability-xcm-channels.md
+++ b/.ai/pages/tutorials-interoperability-xcm-channels.md
@@ -26,7 +26,6 @@ To enable communication between parachains, explicit HRMP channels must be estab

diff --git a/.ai/pages/tutorials-interoperability.md b/.ai/pages/tutorials-interoperability.md
index 39acf65bd..5f4e222a2 100644
--- a/.ai/pages/tutorials-interoperability.md
+++ b/.ai/pages/tutorials-interoperability.md
@@ -31,14 +31,12 @@ Learn to establish and use cross-chain communication channels:

diff --git a/.ai/pages/tutorials-onchain-governance.md b/.ai/pages/tutorials-onchain-governance.md
index 0f5d97d54..55d20346a 100644
--- a/.ai/pages/tutorials-onchain-governance.md
+++ b/.ai/pages/tutorials-onchain-governance.md
@@ -20,7 +20,6 @@ This section provides step-by-step tutorials to help you navigate the technical

diff --git a/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md b/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md
index bcf37232a..08d62300c 100644
--- a/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md
+++ b/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md
@@ -20,7 +20,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall

1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:

    ```toml hl_lines="4" title="runtime/Cargo.toml"
-    [dependencies]
+    ...
    polkadot-sdk = { workspace = true, features = [
        "pallet-utility",

@@ -167,63 +167,13 @@ Update your root parachain template's `Cargo.toml` file to include your custom p

Make sure the `custom-pallet` is a member of the workspace:

```toml hl_lines="4" title="Cargo.toml"
-[workspace]
-default-members = ["pallets/template", "runtime"]
-members = [
-    "node", "pallets/custom-pallet",
-    "pallets/template",
-    "runtime",
-]
+
```

???- code "./Cargo.toml"

    ```rust title="./Cargo.toml"
-    [workspace.package]
-    license = "MIT-0"
-    authors = ["Parity Technologies "]
-    homepage = "https://paritytech.github.io/polkadot-sdk/"
-    repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git"
-    edition = "2021"
-
-    [workspace]
-    default-members = ["pallets/template", "runtime"]
-    members = [
-        "node", "pallets/custom-pallet",
-        "pallets/template",
-        "runtime",
-    ]
-    resolver = "2"
-
-    [workspace.dependencies]
-    parachain-template-runtime = { path = "./runtime", default-features = false }
-    pallet-parachain-template = { path = "./pallets/template", default-features = false }
-    clap = { version = "4.5.13" }
-    color-print = { version = "0.3.4" }
-    docify = { version = "0.2.9" }
-    futures = { version = "0.3.31" }
-    jsonrpsee = { version = "0.24.3" }
-    log = { version = "0.4.22", default-features = false }
-    polkadot-sdk = { version = "2503.0.1", default-features = false }
-    prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" }
-    serde = { version = "1.0.214", default-features = false }
-    codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" }
-    cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false }
-    hex-literal = { version = "0.4.1", default-features = false }
-    scale-info = { version = "2.11.6", default-features = false }
-    serde_json = { version = "1.0.132", default-features = false }
-    smallvec = { version = "1.11.0", default-features = false }
-    substrate-wasm-builder = { version = "26.0.1", default-features = false }
-    frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" }
-
-    [profile.release]
-    opt-level = 3
-    panic = "unwind"
-
-    [profile.production]
-    codegen-units = 1
-    inherits = "release"
-    lto = true
+
    ```

diff --git a/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md b/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md
index 8cfc67b67..ae1b50408 100644
--- a/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md
+++ b/.ai/pages/tutorials-polkadot-sdk-system-chains-asset-hub.md
@@ -35,7 +35,6 @@ Through these tutorials, you'll learn how to manage cross-chain assets, includin

diff --git a/.ai/pages/tutorials-polkadot-sdk.md b/.ai/pages/tutorials-polkadot-sdk.md
index 9cc3bfb54..feb7470eb 100644
--- a/.ai/pages/tutorials-polkadot-sdk.md
+++ b/.ai/pages/tutorials-polkadot-sdk.md
@@ -28,7 +28,6 @@ Follow these key milestones to guide you through parachain development. Each ste

diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md
index 48031eb9f..d44254595 100644
--- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md
+++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md
@@ -94,53 +94,13 @@ To build the smart contract, follow the steps below:

6.
Add the getter and setter functions: ```solidity - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - // SPDX-License-Identifier: MIT - pragma solidity ^0.8.28; - - contract Storage { - // State variable to store our number - uint256 private number; - - // Event to notify when the number changes - event NumberChanged(uint256 newNumber); - - // Function to store a new number - function store(uint256 newNumber) public { - number = newNumber; - emit NumberChanged(newNumber); - } - - // Function to retrieve the stored number - function retrieve() public view returns (uint256) { - return number; - } - } + ``` ## Understanding the Code diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md index 961facefa..e17d3924d 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md @@ -77,31 +77,7 @@ npm install ethers@6.13.5 To interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code: ```javascript title="app/utils/ethers.js" -import { JsonRpcProvider } from 'ethers'; - -export const PASSET_HUB_CONFIG = { - name: 'Passet Hub', - rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC - chainId: 420420422, // Passet Hub testnet chainId - blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/', -}; - -export const getProvider = () => { - return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, { - chainId: PASSET_HUB_CONFIG.chainId, - name: PASSET_HUB_CONFIG.name, - }); -}; - -// Helper to get a signer from a provider -export const getSigner = async (provider) => { - if (window.ethereum) { - await window.ethereum.request({ method: 'eth_requestAccounts' }); - const ethersProvider = new ethers.BrowserProvider(window.ethereum); - return ethersProvider.getSigner(); - } - throw new Error('No Ethereum browser provider detected'); -}; + ``` This file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state. @@ -113,55 +89,13 @@ For this dApp, you'll use a simple Storage contract already deployed. 
So, you ne ???+ code "Storage.sol ABI" ```json title="abis/Storage.json" - [ - { - "inputs": [ - { - "internalType": "uint256", - "name": "_newNumber", - "type": "uint256" - } - ], - "name": "setNumber", - "outputs": [], - "stateMutability": "nonpayable", - "type": "function" - }, - { - "inputs": [], - "name": "storedNumber", - "outputs": [ - { - "internalType": "uint256", - "name": "", - "type": "uint256" - } - ], - "stateMutability": "view", - "type": "function" - } - ] + ``` Now, create a file called `app/utils/contract.js`: ```javascript title="app/utils/contract.js" -import { Contract } from 'ethers'; -import { getProvider } from './ethers'; -import StorageABI from '../../abis/Storage.json'; - -export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; - -export const CONTRACT_ABI = StorageABI; - -export const getContract = () => { - const provider = getProvider(); - return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider); -}; -export const getSignedContract = async (signer) => { - return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer); -}; ``` This file defines the contract address, ABI, and functions to create instances of the contract for reading and writing. @@ -171,167 +105,7 @@ This file defines the contract address, ABI, and functions to create instances o Next, let's create a component to handle wallet connections. Create a new file called `app/components/WalletConnect.js`: ```javascript title="app/components/WalletConnect.js" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { PASSET_HUB_CONFIG } from '../utils/ethers'; - -const WalletConnect = ({ onConnect }) => { - const [account, setAccount] = useState(null); - const [chainId, setChainId] = useState(null); - const [error, setError] = useState(null); - - useEffect(() => { - // Check if user already has an authorized wallet connection - const checkConnection = async () => { - if (window.ethereum) { - try { - // eth_accounts doesn't trigger the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_accounts', - }); - if (accounts.length > 0) { - setAccount(accounts[0]); - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }); - setChainId(parseInt(chainIdHex, 16)); - } - } catch (err) { - console.error('Error checking connection:', err); - setError('Failed to check wallet connection'); - } - } - }; - - checkConnection(); - - if (window.ethereum) { - // Setup wallet event listeners - window.ethereum.on('accountsChanged', (accounts) => { - setAccount(accounts[0] || null); - if (accounts[0] && onConnect) onConnect(accounts[0]); - }); - - window.ethereum.on('chainChanged', (chainIdHex) => { - setChainId(parseInt(chainIdHex, 16)); - }); - } - - return () => { - // Cleanup event listeners - if (window.ethereum) { - window.ethereum.removeListener('accountsChanged', () => {}); - window.ethereum.removeListener('chainChanged', () => {}); - } - }; - }, [onConnect]); - - const connectWallet = async () => { - if (!window.ethereum) { - setError( - 'MetaMask not detected! Please install MetaMask to use this dApp.' 
- ); - return; - } - - try { - // eth_requestAccounts triggers the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_requestAccounts', - }); - setAccount(accounts[0]); - - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }); - const currentChainId = parseInt(chainIdHex, 16); - setChainId(currentChainId); - - // Prompt user to switch networks if needed - if (currentChainId !== PASSET_HUB_CONFIG.chainId) { - await switchNetwork(); - } - - if (onConnect) onConnect(accounts[0]); - } catch (err) { - console.error('Error connecting to wallet:', err); - setError('Failed to connect wallet'); - } - }; - - const switchNetwork = async () => { - try { - await window.ethereum.request({ - method: 'wallet_switchEthereumChain', - params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }], - }); - } catch (switchError) { - // Error 4902 means the chain hasn't been added to MetaMask - if (switchError.code === 4902) { - try { - await window.ethereum.request({ - method: 'wallet_addEthereumChain', - params: [ - { - chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`, - chainName: PASSET_HUB_CONFIG.name, - rpcUrls: [PASSET_HUB_CONFIG.rpc], - blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer], - }, - ], - }); - } catch (addError) { - setError('Failed to add network to wallet'); - } - } else { - setError('Failed to switch network'); - } - } - }; - - // UI-only disconnection - MetaMask doesn't support programmatic disconnection - const disconnectWallet = () => { - setAccount(null); - }; - - return ( -
- {error &&

{error}

} - - {!account ? ( - - ) : ( -
- - {`${account.substring(0, 6)}...${account.substring(38)}`} - - - {chainId !== PASSET_HUB_CONFIG.chainId && ( - - )} -
- )} -
- ); -}; - -export default WalletConnect; + ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. @@ -340,25 +114,9 @@ To integrate this component to your dApp, you need to overwrite the existing boi ```javascript title="app/page.js" -import { useState } from 'react'; - -import WalletConnect from './components/WalletConnect'; -export default function Home() { - const [account, setAccount] = useState(null); - - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - - return ( -
-

- Ethers.js dApp - Passet Hub Smart Contracts -

- -
- ); -} + + + ``` In your terminal, you can launch your project by running: @@ -376,64 +134,7 @@ And you will see the following: Now, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`: ```javascript title="app/components/ReadContract.js" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { getContract } from '../utils/contract'; - -const ReadContract = () => { - const [storedNumber, setStoredNumber] = useState(null); - const [loading, setLoading] = useState(true); - const [error, setError] = useState(null); - - useEffect(() => { - // Function to read data from the blockchain - const fetchData = async () => { - try { - setLoading(true); - const contract = getContract(); - // Call the smart contract's storedNumber function - const number = await contract.storedNumber(); - setStoredNumber(number.toString()); - setError(null); - } catch (err) { - console.error('Error fetching stored number:', err); - setError('Failed to fetch data from the contract'); - } finally { - setLoading(false); - } - }; - - fetchData(); - - // Poll for updates every 10 seconds to keep UI in sync with blockchain - const interval = setInterval(fetchData, 10000); - - // Clean up interval on component unmount - return () => clearInterval(interval); - }, []); - - return ( -
-

Contract Data

- {loading ? ( -
-
-
- ) : error ? ( -

{error}

- ) : ( -
-

- Stored Number: {storedNumber} -

-
- )} -
- ); -}; - -export default ReadContract; + ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically. @@ -442,27 +143,9 @@ To see this change in your dApp, you need to integrate this component into the ` ```javascript title="app/page.js" -import { useState } from 'react'; - -import WalletConnect from './components/WalletConnect'; -import ReadContract from './components/ReadContract'; -export default function Home() { - const [account, setAccount] = useState(null); - - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - - return ( -
-

- Ethers.js dApp - Passet Hub Smart Contracts -

- - -
- ); -} + + + ``` Your dApp will automatically be updated to the following: @@ -474,119 +157,7 @@ Your dApp will automatically be updated to the following: Finally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`: ```javascript title="app/components/WriteContract.js" -'use client'; - -import { useState } from 'react'; -import { getSignedContract } from '../utils/contract'; -import { ethers } from 'ethers'; - -const WriteContract = ({ account }) => { - const [newNumber, setNewNumber] = useState(''); - const [status, setStatus] = useState({ type: null, message: '' }); - const [isSubmitting, setIsSubmitting] = useState(false); - - const handleSubmit = async (e) => { - e.preventDefault(); - - // Validation checks - if (!account) { - setStatus({ type: 'error', message: 'Please connect your wallet first' }); - return; - } - - if (!newNumber || isNaN(Number(newNumber))) { - setStatus({ type: 'error', message: 'Please enter a valid number' }); - return; - } - - try { - setIsSubmitting(true); - setStatus({ type: 'info', message: 'Initiating transaction...' }); - - // Get a signer from the connected wallet - const provider = new ethers.BrowserProvider(window.ethereum); - const signer = await provider.getSigner(); - const contract = await getSignedContract(signer); - - // Send transaction to blockchain and wait for user confirmation in wallet - setStatus({ - type: 'info', - message: 'Please confirm the transaction in your wallet...', - }); - - // Call the contract's setNumber function - const tx = await contract.setNumber(newNumber); - - // Wait for transaction to be mined - setStatus({ - type: 'info', - message: 'Transaction submitted. Waiting for confirmation...', - }); - const receipt = await tx.wait(); - - setStatus({ - type: 'success', - message: `Transaction confirmed! Transaction hash: ${receipt.hash}`, - }); - setNewNumber(''); - } catch (err) { - console.error('Error updating number:', err); - - // Error code 4001 is MetaMask's code for user rejection - if (err.code === 4001) { - setStatus({ type: 'error', message: 'Transaction rejected by user.' }); - } else { - setStatus({ - type: 'error', - message: `Error: ${err.message || 'Failed to send transaction'}`, - }); - } - } finally { - setIsSubmitting(false); - } - }; - - return ( -
-

Update Stored Number

- {status.message && ( -
- {status.message} -
- )} -
- setNewNumber(e.target.value)} - disabled={isSubmitting || !account} - className="w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400" - /> - -
- {!account && ( -

- Connect your wallet to update the stored number. -

- )} -
- ); -}; - -export default WriteContract; + ``` This component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed. @@ -594,32 +165,7 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.js` file to integrate all components: ```javascript title="app/page.js" -'use client'; - -import { useState } from 'react'; - -import WalletConnect from './components/WalletConnect'; -import ReadContract from './components/ReadContract'; -import WriteContract from './components/WriteContract'; - -export default function Home() { - const [account, setAccount] = useState(null); - - const handleConnect = (connectedAccount) => { - setAccount(connectedAccount); - }; - - return ( -
-

- Ethers.js dApp - Passet Hub Smart Contracts -

- - - -
- ); -} + ``` The completed UI will display: diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md index ba55872d5..a208400f4 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md @@ -77,47 +77,7 @@ npm install --save-dev typescript @types/node To interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code: ```typescript title="viem.ts" -import { createPublicClient, http, createWalletClient, custom } from 'viem' -import 'viem/window'; - - -const transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io') - -// Configure the Passet Hub chain -export const passetHub = { - id: 420420422, - name: 'Passet Hub', - network: 'passet-hub', - nativeCurrency: { - decimals: 18, - name: 'PAS', - symbol: 'PAS', - }, - rpcUrls: { - default: { - http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], - }, - }, -} as const - -// Create a public client for reading data -export const publicClient = createPublicClient({ - chain: passetHub, - transport -}) - -// Create a wallet client for signing transactions -export const getWalletClient = async () => { - if (typeof window !== 'undefined' && window.ethereum) { - const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' }); - return createWalletClient({ - chain: passetHub, - transport: custom(window.ethereum), - account, - }); - } - throw new Error('No Ethereum browser provider detected'); -}; + ``` This file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\_blank} reference for more information. 
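For a quick sanity check of this setup, a short script along the following lines can exercise both clients. This is only an illustrative sketch based on the `publicClient` and `getWalletClient` exports shown above; the `quick-check.ts` filename and the recipient address are placeholders, not part of the tutorial.

```typescript title="quick-check.ts"
// Minimal usage sketch for the helpers exported by utils/viem.ts (assumed above).
import { formatEther, parseEther } from 'viem';
import { publicClient, getWalletClient } from './viem';

async function main() {
  // Read-only calls go through the Public Client
  const blockNumber = await publicClient.getBlockNumber();
  console.log(`Current block: ${blockNumber}`);

  // State-changing calls need the browser wallet via the Wallet Client
  const walletClient = await getWalletClient();
  const [account] = await walletClient.getAddresses();

  const balance = await publicClient.getBalance({ address: account });
  console.log(`Balance: ${formatEther(balance)} PAS`);

  // Example transfer; replace the placeholder recipient before running
  const hash = await walletClient.sendTransaction({
    to: '0x0000000000000000000000000000000000000000',
    value: parseEther('0.01'),
  });
  console.log(`Transaction hash: ${hash}`);
}

main().catch(console.error);
```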
@@ -163,31 +123,7 @@ Create a folder called `abis` at the root of your project, then create a file na Next, create a file called `utils/contract.ts`: ```typescript title="contract.ts" -import { getContract } from 'viem'; -import { publicClient, getWalletClient } from './viem'; -import StorageABI from '../../abis/Storage.json'; - -export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; -export const CONTRACT_ABI = StorageABI; - -// Create a function to get a contract instance for reading -export const getContractInstance = () => { - return getContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - client: publicClient, - }); -}; -// Create a function to get a contract instance with a signer for writing -export const getSignedContract = async () => { - const walletClient = await getWalletClient(); - return getContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - client: walletClient, - }); -}; ``` This file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts. @@ -197,180 +133,7 @@ This file defines the contract address, ABI, and functions to create a viem [con Now, let's create a component to handle wallet connections. Create a new file called `components/WalletConnect.tsx`: ```typescript title="WalletConnect.tsx" -"use client"; - -import React, { useState, useEffect } from "react"; -import { passetHub } from "../utils/viem"; - -interface WalletConnectProps { - onConnect: (account: string) => void; -} - -const WalletConnect: React.FC = ({ onConnect }) => { - const [account, setAccount] = useState(null); - const [chainId, setChainId] = useState(null); - const [error, setError] = useState(null); - - useEffect(() => { - // Check if user already has an authorized wallet connection - const checkConnection = async () => { - if (typeof window !== 'undefined' && window.ethereum) { - try { - // eth_accounts doesn't trigger the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_accounts', - }) as string[]; - - if (accounts.length > 0) { - setAccount(accounts[0]); - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }) as string; - setChainId(parseInt(chainIdHex, 16)); - onConnect(accounts[0]); - } - } catch (err) { - console.error('Error checking connection:', err); - setError('Failed to check wallet connection'); - } - } - }; - - checkConnection(); - - if (typeof window !== 'undefined' && window.ethereum) { - // Setup wallet event listeners - window.ethereum.on('accountsChanged', (accounts: string[]) => { - setAccount(accounts[0] || null); - if (accounts[0]) onConnect(accounts[0]); - }); - - window.ethereum.on('chainChanged', (chainIdHex: string) => { - setChainId(parseInt(chainIdHex, 16)); - }); - } - return () => { - // Cleanup event listeners - if (typeof window !== 'undefined' && window.ethereum) { - window.ethereum.removeListener('accountsChanged', () => {}); - window.ethereum.removeListener('chainChanged', () => {}); - } - }; - }, [onConnect]); - - const connectWallet = async () => { - if (typeof window === 'undefined' || !window.ethereum) { - setError( - 'MetaMask not detected! Please install MetaMask to use this dApp.' 
- ); - return; - } - - try { - // eth_requestAccounts triggers the wallet popup - const accounts = await window.ethereum.request({ - method: 'eth_requestAccounts', - }) as string[]; - - setAccount(accounts[0]); - - const chainIdHex = await window.ethereum.request({ - method: 'eth_chainId', - }) as string; - - const currentChainId = parseInt(chainIdHex, 16); - setChainId(currentChainId); - - // Prompt user to switch networks if needed - if (currentChainId !== passetHub.id) { - await switchNetwork(); - } - - onConnect(accounts[0]); - } catch (err) { - console.error('Error connecting to wallet:', err); - setError('Failed to connect wallet'); - } - }; - - const switchNetwork = async () => { - console.log('Switch network') - try { - await window.ethereum.request({ - method: 'wallet_switchEthereumChain', - params: [{ chainId: `0x${passetHub.id.toString(16)}` }], - }); - } catch (switchError: any) { - // Error 4902 means the chain hasn't been added to MetaMask - if (switchError.code === 4902) { - try { - await window.ethereum.request({ - method: 'wallet_addEthereumChain', - params: [ - { - chainId: `0x${passetHub.id.toString(16)}`, - chainName: passetHub.name, - rpcUrls: [passetHub.rpcUrls.default.http[0]], - nativeCurrency: { - name: passetHub.nativeCurrency.name, - symbol: passetHub.nativeCurrency.symbol, - decimals: passetHub.nativeCurrency.decimals, - }, - }, - ], - }); - } catch (addError) { - setError('Failed to add network to wallet'); - } - } else { - setError('Failed to switch network'); - } - } - }; - - // UI-only disconnection - MetaMask doesn't support programmatic disconnection - const disconnectWallet = () => { - setAccount(null); - }; - - return ( -
- {error &&

{error}

} - - {!account ? ( - - ) : ( -
- - {`${account.substring(0, 6)}...${account.substring(38)}`} - - - {chainId !== passetHub.id && ( - - )} -
- )} -
- ); -}; - -export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected. @@ -379,24 +142,9 @@ To use this component in your dApp, replace the existing boilerplate in `app/pag ```typescript title="page.tsx" -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - return ( -
-

- Viem dApp - Passet Hub Smart Contracts -

- -
- ); -} + ``` Now you're ready to run your dApp. From your project directory, execute: @@ -414,70 +162,7 @@ Navigate to `http://localhost:3000` in your browser, and you should see your dAp Now, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`: ```typescript title="ReadContract.tsx" -'use client'; - -import React, { useState, useEffect } from 'react'; -import { publicClient } from '../utils/viem'; -import { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract'; - -const ReadContract: React.FC = () => { - const [storedNumber, setStoredNumber] = useState(null); - const [loading, setLoading] = useState(true); - const [error, setError] = useState(null); - - useEffect(() => { - // Function to read data from the blockchain - const fetchData = async () => { - try { - setLoading(true); - // Call the smart contract's storedNumber function - const number = await publicClient.readContract({ - address: CONTRACT_ADDRESS, - abi: CONTRACT_ABI, - functionName: 'storedNumber', - args: [], - }) as bigint; - - setStoredNumber(number.toString()); - setError(null); - } catch (err) { - console.error('Error fetching stored number:', err); - setError('Failed to fetch data from the contract'); - } finally { - setLoading(false); - } - }; - - fetchData(); - // Poll for updates every 10 seconds to keep UI in sync with blockchain - const interval = setInterval(fetchData, 10000); - - // Clean up interval on component unmount - return () => clearInterval(interval); - }, []); - - return ( -
-

Contract Data

- {loading ? ( -
-
-
- ) : error ? ( -

{error}

- ) : ( -
-

- Stored Number: {storedNumber} -

-
- )} -
- ); -}; - -export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state. @@ -486,26 +171,9 @@ To reflect this change in your dApp, incorporate this component into the `app/pa ```typescript title="page.tsx" -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -import ReadContract from "./components/ReadContract"; -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - return ( -
-

- Viem dApp - Passet Hub Smart Contracts -

- - -
- ); -} + ``` And you will see in your browser: @@ -741,31 +409,7 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.tsx` file to integrate all components: ```typescript title="page.tsx" -"use client"; - -import { useState } from "react"; -import WalletConnect from "./components/WalletConnect"; -import ReadContract from "./components/ReadContract"; -import WriteContract from "./components/WriteContract"; - -export default function Home() { - const [account, setAccount] = useState(null); - const handleConnect = (connectedAccount: string) => { - setAccount(connectedAccount); - }; - - return ( -
-

- Viem dApp - Passet Hub Smart Contracts -

- - - -
- ); -} ``` After that, you will see: diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md index db839e043..b8b568f72 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md @@ -62,42 +62,7 @@ Let's start by setting up Hardhat for your Storage contract project: 6. Configure Hardhat by updating the `hardhat.config.js` file: ```javascript title="hardhat.config.js" - require("@nomicfoundation/hardhat-toolbox"); - - require("@parity/hardhat-polkadot"); - - const { vars } = require("hardhat/config"); - - /** @type import('hardhat/config').HardhatUserConfig */ - module.exports = { - solidity: "0.8.28", - resolc: { - compilerSource: "npm", - }, - networks: { - hardhat: { - polkavm: true, - nodeConfig: { - nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE', - rpcPort: 8000, - dev: true, - }, - adapterConfig: { - adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER', - dev: true, - }, - }, - localNode: { - polkavm: true, - url: `http://127.0.0.1:8545`, - }, - passetHub: { - polkavm: true, - url: 'https://testnet-passet-hub-eth-rpc.polkadot.io', - accounts: [vars.get("PRIVATE_KEY")], - }, - }, - }; + ``` Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. @@ -351,13 +316,7 @@ Testing is a critical part of smart contract development. Hardhat makes it easy 1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic: ```javascript title="StorageModule.js" - const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules'); - - module.exports = buildModule('StorageModule', (m) => { - const storage = m.contract('Storage'); - - return { storage }; - }); + ``` 2. 
Deploy to the local network: diff --git a/.ai/pages/tutorials.md b/.ai/pages/tutorials.md index a97c290d5..fefe297bf 100644 --- a/.ai/pages/tutorials.md +++ b/.ai/pages/tutorials.md @@ -20,7 +20,6 @@ The Zero to Hero series offers step-by-step guidance to development across the P @@ -32,28 +31,24 @@ The Zero to Hero series offers step-by-step guidance to development across the P diff --git a/.ai/site-index.json b/.ai/site-index.json index b17efce64..7129eaa28 100644 --- a/.ai/site-index.json +++ b/.ai/site-index.json @@ -54,7 +54,7 @@ "estimated_token_count_total": 1505 }, "hash": "sha256:2b017d8a89f8734b9cbb501f03612a22657d2f8d4d85c51e490e4c8ca4bf771b", - "last_modified": "2025-10-23T16:29:21+00:00", + "last_modified": "2025-10-24T21:11:54+00:00", "token_estimator": "heuristic-v1" }, { @@ -106,13 +106,13 @@ } ], "stats": { - "chars": 7146, - "words": 978, + "chars": 7032, + "words": 970, "headings": 7, - "estimated_token_count_total": 1635 + "estimated_token_count_total": 1620 }, - "hash": "sha256:46252e238b0b51105148dc622da6d8809c55ec11da7ec7b2953c35ca52f5f585", - "last_modified": "2025-10-23T16:29:22+00:00", + "hash": "sha256:c31f4cc6f58644a6a03957d32e74e65f30e6b4f5214416a9e379de64144a0833", + "last_modified": "2025-10-24T21:11:54+00:00", "token_estimator": "heuristic-v1" }, { @@ -149,13 +149,13 @@ } ], "stats": { - "chars": 7729, - "words": 836, + "chars": 5974, + "words": 747, "headings": 4, - "estimated_token_count_total": 1487 + "estimated_token_count_total": 1271 }, - "hash": "sha256:b07cb65636f24dbff99f21a5c6e4ac047e6455a879c3f9bbf692514fc24da17b", - "last_modified": "2025-10-23T16:29:23+00:00", + "hash": "sha256:84645b90dc0da9db8e83de27cf4c30bfff320498e249f43d6845b782a0dcd082", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -222,7 +222,7 @@ "estimated_token_count_total": 955 }, "hash": "sha256:72ee7394fd1308c111a8d548cb4dc63c6b9bc5b6e2bb556dd1baacbaedb92286", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -274,7 +274,7 @@ "estimated_token_count_total": 876 }, "hash": "sha256:d6cb22337280a19bdf24981dcba98f337d48ee4f79ce7ac040466ef1cb4b330b", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -356,7 +356,7 @@ "estimated_token_count_total": 2744 }, "hash": "sha256:1a2d34ccab19bd71263763bbc294977acf34f5800398f51398753594cfc7d7a6", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -428,7 +428,7 @@ "estimated_token_count_total": 608 }, "hash": "sha256:7bba6105d99721373aa6f494627d20af97b1851c19703f26be26c32f0c83524b", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -495,7 +495,7 @@ "estimated_token_count_total": 558 }, "hash": "sha256:b79fe56c9604712825bdf30d17667fd8f237fce9691be0d8d042d38691dbba7a", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -547,7 +547,7 @@ "estimated_token_count_total": 348 }, "hash": "sha256:11cd8d428fa9c3e70490da5c63ce4597cd89ec46306d2bb49b016ced6aa68c3d", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -574,7 +574,7 @@ "estimated_token_count_total": 12 }, "hash": 
"sha256:3821c2ef97699091b76e1de58e6d95e866df69d39fca16f2a15c156b71da5b22", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -601,7 +601,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:634e299f347beb8ad690697943bb7f99915d62d40cda4227179619ed18abe2ff", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -649,7 +649,7 @@ "estimated_token_count_total": 1373 }, "hash": "sha256:fc85c27ad58c1ca6d0e1fcded4b8e2b6e3d0e888ed4aa99158e21a5e799f5e6b", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -702,7 +702,7 @@ "estimated_token_count_total": 4979 }, "hash": "sha256:ed3b7c8101b69f9c907cca7c5edfef67fdb5e7bc3c8df8d9fbad297f9dd3c80a", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -759,7 +759,7 @@ "estimated_token_count_total": 1781 }, "hash": "sha256:35c71a215558cd0642d363e4515ad240093995d42720e6495cd2994c859243e4", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -806,7 +806,7 @@ "estimated_token_count_total": 1449 }, "hash": "sha256:346061a3b851699f815068b42a949f7a2259f6ece083c97cf35538cb7bd4e547", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -853,7 +853,7 @@ "estimated_token_count_total": 1082 }, "hash": "sha256:ec82957c768c2c07a272e7a28659c812b223df836e21372b1642f0bb249d7b39", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -895,7 +895,7 @@ "estimated_token_count_total": 4182 }, "hash": "sha256:25a2c4b5830df38e0aacec94d288179064742759e7df31fcb9905ad405e78fc3", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -922,7 +922,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:63584f5b1dab7b67b18b35b47dfc19d00ad5c013804772f0d653a11ac3fca38d", - "last_modified": "2025-10-23T16:29:23+00:00", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -948,13 +948,13 @@ } ], "stats": { - "chars": 1663, - "words": 215, + "chars": 1619, + "words": 211, "headings": 2, - "estimated_token_count_total": 340 + "estimated_token_count_total": 328 }, - "hash": "sha256:d4c2d7fd46ddf60f638f948c88ba3940de6d69f140923ba8df52ed787b0afede", - "last_modified": "2025-10-23T16:29:23+00:00", + "hash": "sha256:30ffcc12fff151fd0fa1baedfa803ecbb15106504df99c5a032ca173fffe0eca", + "last_modified": "2025-10-24T21:11:55+00:00", "token_estimator": "heuristic-v1" }, { @@ -1016,13 +1016,13 @@ } ], "stats": { - "chars": 33184, - "words": 3164, + "chars": 32843, + "words": 3133, "headings": 9, - "estimated_token_count_total": 6512 + "estimated_token_count_total": 6440 }, - "hash": "sha256:43a4a5832611e49024022c1e9e825742919017f959036bbcb82e82622d0daa18", - "last_modified": "2025-10-23T16:29:24+00:00", + "hash": "sha256:4d86accdf9d31b7763b22ab53b78ea3008d37b634ee7e454ff8e9adbd0876698", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1048,13 +1048,13 @@ } ], "stats": { - "chars": 2375, - "words": 326, + "chars": 2331, + "words": 322, "headings": 2, - 
"estimated_token_count_total": 402 + "estimated_token_count_total": 390 }, - "hash": "sha256:addb6b5bc11163daf47aef933fe23c7d1dcbc20fa4529090b3bbedf49bde4aa5", - "last_modified": "2025-10-23T16:29:21+00:00", + "hash": "sha256:3ca63851d29942ed00d3f143930edd1248a6321d1d8f6473ae697b72b0b9116e", + "last_modified": "2025-10-24T21:11:54+00:00", "token_estimator": "heuristic-v1" }, { @@ -1117,7 +1117,7 @@ "estimated_token_count_total": 1520 }, "hash": "sha256:ed09ef7a6abe21204006186fd5791ada7597688fad67e30244dc449c51330309", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1179,7 +1179,7 @@ "estimated_token_count_total": 2598 }, "hash": "sha256:b2b3d8c048863e7760f633b12ab2a0202c741be3050ea4beafb9a7265cfe96b5", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1236,7 +1236,7 @@ "estimated_token_count_total": 1219 }, "hash": "sha256:262e7a3ad3d0a0102897c52c7589e3f94c7827c441398b3b446b205f6c6753d3", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1278,7 +1278,7 @@ "estimated_token_count_total": 905 }, "hash": "sha256:ad8e6d9c77d5451c5f4d17f8e6311b21e6ad24eae8780fd4c3ae6013744822cf", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1345,7 +1345,7 @@ "estimated_token_count_total": 3995 }, "hash": "sha256:19997d390abf2847824024ba923f46a61106ef77544d256d50b371210816b309", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1413,7 +1413,7 @@ "estimated_token_count_total": 2021 }, "hash": "sha256:d253314c3db3e631a43137fbc9756eac3143c86c49a3d7a6c109f070f384ef84", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1439,13 +1439,13 @@ } ], "stats": { - "chars": 1899, - "words": 259, + "chars": 1866, + "words": 256, "headings": 2, - "estimated_token_count_total": 335 + "estimated_token_count_total": 326 }, - "hash": "sha256:9a08b66442c564c7116c686d8914b74ad617326f450d0894b05e753462f69aac", - "last_modified": "2025-10-23T16:29:24+00:00", + "hash": "sha256:705127e925f797216ab35ca7d0b4bb4fe56ee3c252318d35678e2d5f330a6571", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1506,13 +1506,13 @@ } ], "stats": { - "chars": 8470, - "words": 1227, + "chars": 7622, + "words": 1153, "headings": 9, - "estimated_token_count_total": 1944 + "estimated_token_count_total": 1771 }, - "hash": "sha256:4fc8cab40e982e860b64d9aede1058fe7fa82ec321ac215b919db00c4df0a9c0", - "last_modified": "2025-10-23T16:29:24+00:00", + "hash": "sha256:0f8b6191da1d8ed5c569081d33e8ae821c76c8272aefc5009ddc1754100d45a0", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1584,7 +1584,7 @@ "estimated_token_count_total": 3072 }, "hash": "sha256:ea36f84c753f4671c27d2d5ad1f785ddadb6b333a7436208dc9b61d5b079cf21", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1666,7 +1666,7 @@ "estimated_token_count_total": 3023 }, "hash": "sha256:f89b54fce05c6e26b58bc8a38694953422faf4a3559799a7d2f70dcfd6176304", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": 
"2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1718,7 +1718,7 @@ "estimated_token_count_total": 1287 }, "hash": "sha256:9686bce57413e86675e88ef7a2ce1e1f70226d10c7df8125f3c2bc7f729fcedd", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1770,7 +1770,7 @@ "estimated_token_count_total": 744 }, "hash": "sha256:358ed14147b96b47deb61df9a1ea0e1103a139ea5edb78c5d50a48d5a779b80d", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1801,13 +1801,13 @@ } ], "stats": { - "chars": 4806, - "words": 655, + "chars": 4795, + "words": 654, "headings": 3, - "estimated_token_count_total": 960 + "estimated_token_count_total": 957 }, - "hash": "sha256:32ff8711945e175aa7d073821a38320588997365d65b84a34aa36734066fc898", - "last_modified": "2025-10-23T16:29:24+00:00", + "hash": "sha256:1acbec60b62ffd3359fa04d224e8be0154d6d115a65b2e33d119905d382a7f17", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1890,7 +1890,7 @@ "estimated_token_count_total": 2709 }, "hash": "sha256:2ee5656f749b4bca445172f2bc66c7fc39af40ff173626662ae4c399f49cf909", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -1948,7 +1948,7 @@ "estimated_token_count_total": 1892 }, "hash": "sha256:74de798c287cae75729e7db54019507f03a361dbbd1f2bb58c4694605f83efab", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2000,7 +2000,7 @@ "estimated_token_count_total": 4725 }, "hash": "sha256:b17e06e9e6bced8db89c193fac16c297b7f263c4a6613bf290b162f0f651ddb6", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2062,7 +2062,7 @@ "estimated_token_count_total": 1819 }, "hash": "sha256:b0c1535fa8e969a9bdeee426a5a35a42b4649121fb8ce6fd2b15fdeba35b5d5f", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2109,7 +2109,7 @@ "estimated_token_count_total": 1161 }, "hash": "sha256:07e63e1e99b9acf1cc3b5ef8fa1f06ff22182b2a801582ce800eba37d7d39408", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2175,13 +2175,13 @@ } ], "stats": { - "chars": 18500, - "words": 2363, + "chars": 14162, + "words": 1849, "headings": 10, - "estimated_token_count_total": 4014 + "estimated_token_count_total": 2936 }, - "hash": "sha256:55dc252fdecf1590048ce8d009b822e90231442abe81e9593cf1635944a31336", - "last_modified": "2025-10-23T16:29:25+00:00", + "hash": "sha256:59387d27cb1f775bdb575cd61e9168c63ebc93e0afe020e93ea6fce3d2f61f5b", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2233,7 +2233,7 @@ "estimated_token_count_total": 2030 }, "hash": "sha256:f4964f894f7cd2fdfd699c017b4bd25cffc322b03a5a88a36c682cf952832ccc", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2259,13 +2259,13 @@ } ], "stats": { - "chars": 1372, - "words": 174, + "chars": 1350, + "words": 172, "headings": 2, - "estimated_token_count_total": 236 + "estimated_token_count_total": 230 }, - "hash": 
"sha256:3b0a9e8037c7634c33ac6674170bd763599fca914855d9d2fbf490d359140130", - "last_modified": "2025-10-23T16:29:24+00:00", + "hash": "sha256:f786ec04fd5c7179716a160f93f6bff3c497839cc3627e2279d9bec234ce0c3c", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2326,13 +2326,13 @@ } ], "stats": { - "chars": 14731, - "words": 1881, + "chars": 13835, + "words": 1787, "headings": 9, - "estimated_token_count_total": 3342 + "estimated_token_count_total": 3074 }, - "hash": "sha256:9d6daa3f4daf149ae822b60060d14ff022bd4b3440cecdc969a48c105eb82a21", - "last_modified": "2025-10-23T16:29:25+00:00", + "hash": "sha256:50d751cce37dd3db81dcc3c6014dcc2b1dd7b0c95bfd78c2d69b6d06f6444e37", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2384,7 +2384,7 @@ "estimated_token_count_total": 1572 }, "hash": "sha256:68fc67390e24741081c9a04d78951e76c7d4ff7cf6eddaba7dcbbdc1812c71d3", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2441,7 +2441,7 @@ "estimated_token_count_total": 1559 }, "hash": "sha256:0024f5e4c12ab7b019e5ee183e7c78d175e1125868c5458b97d3accd9fac75bc", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2467,13 +2467,13 @@ } ], "stats": { - "chars": 1243, - "words": 157, + "chars": 1221, + "words": 155, "headings": 2, - "estimated_token_count_total": 211 + "estimated_token_count_total": 205 }, - "hash": "sha256:0ce1fe38de00827a0735b9fa8076492205c2450c61da9fbd1937d9f38cfe7825", - "last_modified": "2025-10-23T16:29:25+00:00", + "hash": "sha256:16f4f67b56ecef53c3c7ab09c438dcc9d4e613b0824df5b1691bd7c4f6296eda", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2505,7 +2505,7 @@ "estimated_token_count_total": 341 }, "hash": "sha256:94488b5170bf7d37c72418d53e24ac36b124abf0ab1cf125f16e75187e626e4e", - "last_modified": "2025-10-23T16:29:24+00:00", + "last_modified": "2025-10-24T21:11:57+00:00", "token_estimator": "heuristic-v1" }, { @@ -2548,7 +2548,7 @@ "estimated_token_count_total": 313 }, "hash": "sha256:dae93f5037ef8a3f508da802a016df748ce0aed69620348b9895f712609f7e84", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2585,7 +2585,7 @@ "estimated_token_count_total": 505 }, "hash": "sha256:758eb0881fd029ab949ca4ac17ed0dd50c8f1e1e6109f7d0f36416a1082b21e7", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2622,7 +2622,7 @@ "estimated_token_count_total": 570 }, "hash": "sha256:1247dfb5f5ac040bca81cd1002153e0ee53f4052b2a3d40b623834bd7f00d065", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2699,7 +2699,7 @@ "estimated_token_count_total": 6228 }, "hash": "sha256:72e41f816f07026d96c803f399c71852aa1151c464e79cec3e1746b282d5eaae", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2777,7 +2777,7 @@ "estimated_token_count_total": 4188 }, "hash": "sha256:fe008393aa37c27bb71b4483d4e2c4fbcda94f8c1be461fdd07eff40efbb4e26", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2840,7 +2840,7 
@@ "estimated_token_count_total": 1375 }, "hash": "sha256:8e6bfed5fa59bb748e80698ea702f62ce6951c48bdb955ee9ef0d3516e856887", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -2872,7 +2872,7 @@ "estimated_token_count_total": 323 }, "hash": "sha256:5c3a10769e30b4da62e6c188e99310354e6e9af4595c7920c2977a54b8e1853c", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3019,7 +3019,7 @@ "estimated_token_count_total": 1605 }, "hash": "sha256:89410eccd72495aa0a4eecf229c74a8f2db31994c6a03b9957c2be92ea227520", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3216,7 +3216,7 @@ "estimated_token_count_total": 9750 }, "hash": "sha256:1fb7a20bc4a799a771954720428029419ec73afa640e589590c43dd041a7e307", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3304,7 +3304,7 @@ "estimated_token_count_total": 4475 }, "hash": "sha256:f0cee7ccb3cd294e8f909a220bb63987239ef8155c187a04f8c4864ffdcde288", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3397,7 +3397,7 @@ "estimated_token_count_total": 3900 }, "hash": "sha256:a7541553a50a250521c0a280f997d614763c643b1028147f3fb61391950bda15", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3470,7 +3470,7 @@ "estimated_token_count_total": 3250 }, "hash": "sha256:bc771f912627fa09cad64adab1bc81c052f650d6c5a3b4f0c91883a98f6628da", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3543,7 +3543,7 @@ "estimated_token_count_total": 3033 }, "hash": "sha256:bc87533eaf42a979a0c17f50ecdc668c364889257c7e0d27b81129770660fd53", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3596,7 +3596,7 @@ "estimated_token_count_total": 2512 }, "hash": "sha256:5d13a0873a78a9802b06686d7caafbf4d23b6ba1edf7d3518943301f2b0110c4", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3628,7 +3628,7 @@ "estimated_token_count_total": 436 }, "hash": "sha256:fa9fb58c7fb7a1c86f147b9c95d3ef65a4aed6b559989dc2d439efec21a80be4", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3670,7 +3670,7 @@ "estimated_token_count_total": 2432 }, "hash": "sha256:809d0ff921587f29045df1d31a5a9fe32ee13fa7b9698aa27ff9f60b2aa7a4d8", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3743,7 +3743,7 @@ "estimated_token_count_total": 1118 }, "hash": "sha256:0468268436ffdb759cad8390a838d5fba2391118baa8fd8cd494b36397b10329", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3819,13 +3819,13 @@ } ], "stats": { - "chars": 18009, - "words": 2181, + "chars": 11136, + "words": 1417, "headings": 12, - "estimated_token_count_total": 3820 + "estimated_token_count_total": 2366 }, - "hash": 
"sha256:10c2497147b1d5404e8ab22432832d54fd0ebcf5eb36bbbe9e2d38308a1dfe72", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:454593143212705cbf14c3c6277c42c3ca6aa631c204918e8332833f591887c3", + "last_modified": "2025-10-24T21:12:00+00:00", "token_estimator": "heuristic-v1" }, { @@ -3892,7 +3892,7 @@ "estimated_token_count_total": 2317 }, "hash": "sha256:605d2cbb7eabb2ea0fd928bc3ecdf9ee8b095e3dd9643f2b0918fef7b5a3f4a8", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -3934,7 +3934,7 @@ "estimated_token_count_total": 1223 }, "hash": "sha256:798353114d43dee2873e28b293876c0761e2fef596bc3327c5986a4343c70da1", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -3987,7 +3987,7 @@ "estimated_token_count_total": 1638 }, "hash": "sha256:807cee6869059dd933905d1cf6c76e3b86e02baee3de3113f7e5b4c8697fbd22", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4019,7 +4019,7 @@ "estimated_token_count_total": 190 }, "hash": "sha256:97ce7c57e3795b573b61011b0f5430da5a52380214d892ee58ee2aa61778caef", - "last_modified": "2025-10-23T16:29:25+00:00", + "last_modified": "2025-10-24T21:11:58+00:00", "token_estimator": "heuristic-v1" }, { @@ -4087,7 +4087,7 @@ "estimated_token_count_total": 2300 }, "hash": "sha256:ba24e31e2ad94fbf1d73f1878da92dd2e1476db00170780bbdf0e65ab18bc961", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4140,7 +4140,7 @@ "estimated_token_count_total": 1987 }, "hash": "sha256:2ca93b09d3bb9159bbf53816886a9b242bb3c13b996c51fd52962e049e2d5477", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4213,7 +4213,7 @@ "estimated_token_count_total": 1084 }, "hash": "sha256:7f533abe61586af8438e350c41b741d74a8edb839f9dc4139bc4619ba3748258", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4281,7 +4281,7 @@ "estimated_token_count_total": 1166 }, "hash": "sha256:ed3986f30880fefca5975fcdc847c68b4aca65862c63e3002b25391b0521781d", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4339,7 +4339,7 @@ "estimated_token_count_total": 942 }, "hash": "sha256:8987fc35cd28602054ee018031f773e2e3837425107c51d0e2ac68a94b86e9c0", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4392,7 +4392,7 @@ "estimated_token_count_total": 1945 }, "hash": "sha256:b8759f61ab57b636228b69d5770c74591998b912cd4596e89eb2ec011da7ef73", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4465,7 +4465,7 @@ "estimated_token_count_total": 2187 }, "hash": "sha256:56269d9ea47f5b4e92cd7d5a1e65ab06d181a9c380f90bb3ef285529b12299f7", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4491,13 +4491,13 @@ } ], "stats": { - "chars": 1096, - "words": 141, + "chars": 1074, + "words": 139, "headings": 2, - "estimated_token_count_total": 191 + "estimated_token_count_total": 
185 }, - "hash": "sha256:9df26b2d1c10327a2880d45d6a704664926a42511b6c3ec9fc63d185bdfe563e", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:900e54d04b11533efb15a34ebd76dc095a1873a926ecf2a5ce494cf0633c8be1", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4535,7 +4535,7 @@ "estimated_token_count_total": 428 }, "hash": "sha256:cfcc76bb24779c9b613f2c046b6f99a0f2529c25fd82287d804f6b945b936227", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4568,7 +4568,7 @@ "estimated_token_count_total": 245 }, "hash": "sha256:6d8e01281a5895fd2bc4438b24c170c72a496de0b838626a53e87685aea4aa25", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4625,7 +4625,7 @@ "estimated_token_count_total": 847 }, "hash": "sha256:a206dd86fc3d80aed22384000839ca0c9c75c69ad461abd9810d96c03cf6a3bd", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4697,7 +4697,7 @@ "estimated_token_count_total": 6312 }, "hash": "sha256:d132a135b7be0a571277eabd4f76781fe02de29d692c1e958560613ec25e891f", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4740,7 +4740,7 @@ "estimated_token_count_total": 834 }, "hash": "sha256:ef2cc8c69ca34dd35a012c361d5a7ce72dab888b4ef674f62310a1d3914c6554", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4772,7 +4772,7 @@ "estimated_token_count_total": 92 }, "hash": "sha256:0de8c1655a1524784010b6cec5fa522b2f764e32f18913f0d262283e0ec0779e", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4834,7 +4834,7 @@ "estimated_token_count_total": 5209 }, "hash": "sha256:966ec1bcc014a454f6b837b503025d9fb89c30f6a65d0aaec82ea5ff976e53a9", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4871,7 +4871,7 @@ "estimated_token_count_total": 683 }, "hash": "sha256:8dc107b7323ca24d3a781ca37b89580aa6a77232a4f6109ca24a1048cd733123", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4930,7 +4930,7 @@ "estimated_token_count_total": 1700 }, "hash": "sha256:47328231d6ff4dc52cd93aaf1baf5d0bc2d9fc372f3d79339d87aafa0dabd1b8", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -4957,7 +4957,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:c72d7d30a019fe1db8ab3993f91dfd4f1bdb4a932aaa685d3baaa0578091d5ce", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5025,7 +5025,7 @@ "estimated_token_count_total": 2453 }, "hash": "sha256:2c77cfb38bb2e466a8f56dabbb706fcd2e90cf1634fc9beb7f0ee95a75735653", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5052,7 +5052,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:cc49fdcc63a43247d80de2f309b9c7501d3054782746d80c003d95f3c43da90d", - "last_modified": 
"2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5125,7 +5125,7 @@ "estimated_token_count_total": 2614 }, "hash": "sha256:4325cdd697814b8043db808da3dee86d3d9c6fc7dd523aae7fe8914d59d1b39c", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5156,13 +5156,13 @@ } ], "stats": { - "chars": 1495, - "words": 201, + "chars": 1473, + "words": 199, "headings": 3, - "estimated_token_count_total": 291 + "estimated_token_count_total": 285 }, - "hash": "sha256:b568596033cdf68e60d72bcb7ee62a794def2bd3ff5b3317ef15895f58a12c57", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:340c8e81fdaca8a1b85a9addeed75cc617513b39658250693e2516c74b86aa6e", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5193,13 +5193,13 @@ } ], "stats": { - "chars": 1295, - "words": 176, + "chars": 1284, + "words": 175, "headings": 3, - "estimated_token_count_total": 183 + "estimated_token_count_total": 180 }, - "hash": "sha256:67be1f6e1199f4ef40e8d5394e8d472f5289b5a9ad384647a03db98b79229c8f", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:6e71534a424f6a08521b19c9b4cf668e495fb7c591463ffe63d1b03a8b17e435", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5257,7 +5257,7 @@ "estimated_token_count_total": 1430 }, "hash": "sha256:1284c42be692167e01bcc44e2e134ec20615402675fac26df246c00aa1588d80", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5325,7 +5325,7 @@ "estimated_token_count_total": 2018 }, "hash": "sha256:49866761ef638dd0683bb5558f5319b9568ff136295b3359580a6f478172c73f", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5373,7 +5373,7 @@ "estimated_token_count_total": 1001 }, "hash": "sha256:165d1f1606785801860e8af5ff4c2d9d393b3bc07d211fc2e6757c70445a8124", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5400,7 +5400,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:91de375b7f822ed56b5e6b4d609d0d36e806d3f77041b4e180b6679b10a3e1c8", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5462,7 +5462,7 @@ "estimated_token_count_total": 1869 }, "hash": "sha256:978b4f2d2888ab26edeae998f09bead9ecd05460d229c63a8b2b2f4475438c69", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5549,7 +5549,7 @@ "estimated_token_count_total": 1870 }, "hash": "sha256:3b766e00e55a224201bc6744386a6dabc7da54ed9199b16abab3b94cff449eca", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5642,7 +5642,7 @@ "estimated_token_count_total": 9871 }, "hash": "sha256:0d7e04fd952cc9d5bd8cdbfd87cc4004c5f95e896a16bc7f89dfc4caeac8f371", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5705,7 +5705,7 @@ "estimated_token_count_total": 2661 }, "hash": "sha256:04e85c4cddb58252f8253d78a3924bb56952dac2a3e9a057704a91a0d1f21d75", - "last_modified": "2025-10-23T16:29:28+00:00", + 
"last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5736,13 +5736,13 @@ } ], "stats": { - "chars": 1237, - "words": 164, + "chars": 1226, + "words": 163, "headings": 3, - "estimated_token_count_total": 193 + "estimated_token_count_total": 190 }, - "hash": "sha256:1355969b6b0e723b42815b960c15eb128e4d936d0d707cd66e43820cff765ee3", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:1e474a9a1411a128abe943bdfabd8d5d27eaa7b52c5ba4c68379964fd27c6983", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5773,13 +5773,13 @@ } ], "stats": { - "chars": 1199, - "words": 157, + "chars": 1188, + "words": 156, "headings": 3, - "estimated_token_count_total": 171 + "estimated_token_count_total": 168 }, - "hash": "sha256:4fca64fa791400e9177f6cf3a913c8d041a9ea0c93e3a24d91478867da150288", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:e26ea88a73f187ffbf9c7287f80b9e51604b92896b7c032b26b3d034d3c46b7d", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5811,7 +5811,7 @@ "estimated_token_count_total": 122 }, "hash": "sha256:3a3d8b02539e7aea22d26a8fb845b9e2d19ad3676220b521ab3f176128310698", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5838,7 +5838,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:20c667a337791538e3997f1f449bf69b248ccc4cc806e22615075f24fd3f0202", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5890,7 +5890,7 @@ "estimated_token_count_total": 1898 }, "hash": "sha256:45d815fe79b74b480570a333572fe5a0b0d2fdd33b4aabe2711d94094d6fee10", - "last_modified": "2025-10-23T16:29:21+00:00", + "last_modified": "2025-10-24T21:11:54+00:00", "token_estimator": "heuristic-v1" }, { @@ -5922,7 +5922,7 @@ "estimated_token_count_total": 2232 }, "hash": "sha256:5a8da69a5cea8bd598ee4d102b9abed5d1a29153802a567e22bb4ee720410b32", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -5994,7 +5994,7 @@ "estimated_token_count_total": 579 }, "hash": "sha256:4c33d0ec5026128b3bfdb1dfc1f4b29487404eaa8043071d536e8638356c6e1f", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6036,7 +6036,7 @@ "estimated_token_count_total": 557 }, "hash": "sha256:993e93b05c8fbdfc2f7510c61ac86bc4c2ff0f03e573695b2f260933c8b62f78", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6073,7 +6073,7 @@ "estimated_token_count_total": 280 }, "hash": "sha256:5bdc575ac798a971867a15651c2b4d5139bf0b1fe6854d1865deff280ae6d7f6", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6094,7 +6094,7 @@ "estimated_token_count_total": 0 }, "hash": "sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6161,7 +6161,7 @@ "estimated_token_count_total": 1044 }, "hash": "sha256:d84a5af1a0237a911d25a68c077f508ebbce608f673ef4f9055e8e434daa96b9", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", 
"token_estimator": "heuristic-v1" }, { @@ -6228,7 +6228,7 @@ "estimated_token_count_total": 4229 }, "hash": "sha256:abd9f939f68b068a18567b875c9f7e11d102c54fc02ca0e6ee8041c539061ed0", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6285,7 +6285,7 @@ "estimated_token_count_total": 1286 }, "hash": "sha256:0b43b452e9d709cb324bf51fd88c2fed8e6249534a7c2b852e1bd36bcb9b981a", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6327,7 +6327,7 @@ "estimated_token_count_total": 462 }, "hash": "sha256:c6087224da8140a4a5d8bbc3cb9b9b389ffd57f679dca7eb67df9f64649f0eaf", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6389,7 +6389,7 @@ "estimated_token_count_total": 1827 }, "hash": "sha256:1090b02689df5f4c59bb83f9c81436718d06e46f3b615bc655fef3c7b6c9fb02", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6471,7 +6471,7 @@ "estimated_token_count_total": 2559 }, "hash": "sha256:0857a9e83aefc6d3f04e8cb320ab82d35211bbd73d2eb2614cf7b97f8e6d36b9", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6552,13 +6552,13 @@ } ], "stats": { - "chars": 15774, - "words": 2425, + "chars": 14805, + "words": 2330, "headings": 13, - "estimated_token_count_total": 3827 + "estimated_token_count_total": 3667 }, - "hash": "sha256:e2567b7d5377c87984622cf93afe4bd8cedf46b80597736cf53f26b5f31c5065", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:cf981bba31ab6a031539e0f85c079c20d5d3e202c05ec47564ee5e1563771852", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6605,7 +6605,7 @@ "estimated_token_count_total": 625 }, "hash": "sha256:9ab570299106336e5d75923b876247e8eb4a71851a77e84d68e0335e9da5e0a8", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6631,13 +6631,13 @@ } ], "stats": { - "chars": 1975, - "words": 257, + "chars": 1931, + "words": 253, "headings": 2, - "estimated_token_count_total": 416 + "estimated_token_count_total": 404 }, - "hash": "sha256:f86c2598c9296ca2d09e514ce512440b87f2f11476d2aa9bf0296d2c748b6c96", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:dad21b50f3732256f1367c2e79857f102635f0ed3015ed4013d7d0ca4d8b3a99", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6749,7 +6749,7 @@ "estimated_token_count_total": 5832 }, "hash": "sha256:a7b5239c3be0341ced8f28146e240ff6061fded2e71094bd586beeb024684a50", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6801,7 +6801,7 @@ "estimated_token_count_total": 861 }, "hash": "sha256:97655248c65e816fdf3d85dab4ace7ca0c145c50f671c25c24627cfd7660c7a6", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6858,7 +6858,7 @@ "estimated_token_count_total": 1167 }, "hash": "sha256:b2e8abce15fc9df106a5e972f28c64f606f9dd50ba3a256093eb53bdd5126224", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" 
}, { @@ -6884,13 +6884,13 @@ } ], "stats": { - "chars": 1520, - "words": 203, + "chars": 1498, + "words": 201, "headings": 2, - "estimated_token_count_total": 236 + "estimated_token_count_total": 230 }, - "hash": "sha256:c4c79b14ccefef842c387775b22d2e42da7e136d495ccc2ea4dfed9dcd894667", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:f2cced19ba2b0b1ea46fd2f2892d328ac4797a1253d12cf479c64a447d7ce1ee", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6937,7 +6937,7 @@ "estimated_token_count_total": 1477 }, "hash": "sha256:76500d1d63f4205a84f0bc5b7f9aec945781127d41c32927280ac74bc14f0296", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -6963,13 +6963,13 @@ } ], "stats": { - "chars": 1603, - "words": 218, + "chars": 1570, + "words": 215, "headings": 2, - "estimated_token_count_total": 319 + "estimated_token_count_total": 310 }, - "hash": "sha256:4771f0d30573e36eee1b14cdf3bbca70a4716a3ee8dc8726516487d01a87587c", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:7fb05e7b43cd5413b248605912ae0c6e24fd6c1ca199e64059c686c49d8cc456", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7036,7 +7036,7 @@ "estimated_token_count_total": 3409 }, "hash": "sha256:abe6bedab04f463ec07f554977b8d6355a5d2fad9bcda01cbe58568152295daa", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7088,7 +7088,7 @@ "estimated_token_count_total": 2617 }, "hash": "sha256:7d43408276d811c96b7b081a7b9f4d884893282a230b564c9eb3be2fc7857565", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7114,13 +7114,13 @@ } ], "stats": { - "chars": 1824, - "words": 249, + "chars": 1791, + "words": 246, "headings": 2, - "estimated_token_count_total": 352 + "estimated_token_count_total": 343 }, - "hash": "sha256:073052dcb2d4852bd40338cce41f3aec28ec31f93b6170a4daa1c0c21d54b7cb", - "last_modified": "2025-10-23T16:29:28+00:00", + "hash": "sha256:20f272dbbeb2b50a5e240b53ac45bed797ea58aa03e27c89194c941d66d8accf", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7152,7 +7152,7 @@ "estimated_token_count_total": 451 }, "hash": "sha256:2670bfa3c72e6b28c780cecdd4402691e616c2ab75e1d02a53834477ff1fcff8", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7200,7 +7200,7 @@ "estimated_token_count_total": 1127 }, "hash": "sha256:a476a8f00a86860deb76ceb77de6eecc3d5ed17601841165a53176512ef7d1a6", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7264,7 +7264,7 @@ "estimated_token_count_total": 1867 }, "hash": "sha256:4681fa2a9a5e44a52035ac9e58fb2f5e2abb667c36df94bfb1d4575293129134", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7291,7 +7291,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:3b9160b166d9b42b124f3b07eb26bdc5499fbbace6f951095009a5eee7fccbb6", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7338,7 +7338,7 @@ "estimated_token_count_total": 619 }, "hash": 
"sha256:00be43ac8d666bbe15c5c2fa5a5085697d0bb5a6f341ebbb943a209f0be355df", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7400,7 +7400,7 @@ "estimated_token_count_total": 1440 }, "hash": "sha256:2d228c52844df8952520fafdd3e6f0e26bfd2f32b5ee60c6241cf7d38603643c", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7479,7 +7479,7 @@ "estimated_token_count_total": 2600 }, "hash": "sha256:603890033f956552f0d7b524d0186a8ee256ac76aabf608808dbc992f7d089ab", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7566,7 +7566,7 @@ "estimated_token_count_total": 2534 }, "hash": "sha256:191df9b098e17e9de4597c9f8ced8abbafdfabc7e0f5c0a94d767fc2c9d7742b", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7593,7 +7593,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:c29356358f095b0d413e4c6525146b3f1b0b900853aada2168e7e55cd8dd6641", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7725,7 +7725,7 @@ "estimated_token_count_total": 4105 }, "hash": "sha256:759ab6dea0ad03c3f627558ea186d9f32351fa559acde82931684efc2da59d46", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7777,7 +7777,7 @@ "estimated_token_count_total": 1218 }, "hash": "sha256:26c156146ef9743fc26c6499294ff14186f97edbc2a34f445d3366b72f7148ae", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7809,7 +7809,7 @@ "estimated_token_count_total": 424 }, "hash": "sha256:59ec351fbb8d3a392e90f4f5bf6b62f58b21d6d7a900c5e367e5d2e09ecb3aca", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7851,7 +7851,7 @@ "estimated_token_count_total": 1238 }, "hash": "sha256:6340c8a885d03adf633ae30438d8f21c195c276a48082657bb22dd53341f1cfb", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7919,7 +7919,7 @@ "estimated_token_count_total": 1657 }, "hash": "sha256:4f8573882bd0f9b0bbbb45efa313c2f3bb90e4f90d09a8276b4b99d56b4b01d5", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -7981,7 +7981,7 @@ "estimated_token_count_total": 876 }, "hash": "sha256:8239d1e8d8642cb7c10e9e5f971c99b999e9e4a87373b50bf4a691225c1e4702", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8008,7 +8008,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:6ef13c197dd1865fcc1a405d67486f1d053534d576bb32fe47a442fd2c11b6cd", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8040,7 +8040,7 @@ "estimated_token_count_total": 177 }, "hash": "sha256:ffda04c93c70ec7204be28b642fa6e51f6bf9436d4792ecd25136696683f0902", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ 
-8377,7 +8377,7 @@ "estimated_token_count_total": 5271 }, "hash": "sha256:f0e04286eacf23b182186f23e9854c0cd251545b8a8d561d2503f962dbfe32c0", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8419,7 +8419,7 @@ "estimated_token_count_total": 631 }, "hash": "sha256:baba9dd41091b792d09005d55d3df0bf65b35f42b40ebe63caf425a0978a22b0", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8487,7 +8487,7 @@ "estimated_token_count_total": 1611 }, "hash": "sha256:62beec261e72529f70e07a641177d489d2c8872f9c9d618cbadf1ac0fd881986", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8519,7 +8519,7 @@ "estimated_token_count_total": 233 }, "hash": "sha256:58fd5c8c092ee748c2979164f985a67071a6ccb88492e79cdad536363364c858", - "last_modified": "2025-10-23T16:29:28+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8611,13 +8611,13 @@ } ], "stats": { - "chars": 29648, - "words": 4201, + "chars": 28474, + "words": 4035, "headings": 15, - "estimated_token_count_total": 6521 + "estimated_token_count_total": 6243 }, - "hash": "sha256:1f9ce923b3ce296571fe63837c0d3c3c791a339ef02db09ead6b2b92e9d1bfd5", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:af802947fa4194cbfec09160bb21ff61f013e6f43efa379a835f1f14de9ab8f1", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8680,7 +8680,7 @@ "estimated_token_count_total": 1399 }, "hash": "sha256:bcad23a74d962cab72b54cdc090bf9ee0cd5ecf79f70fb642f154668c2743983", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8778,7 +8778,7 @@ "estimated_token_count_total": 4464 }, "hash": "sha256:299597c39d0e4e4902be8e45b354fff78a862aa5799e4f16d16787a97a1e3da8", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8911,7 +8911,7 @@ "estimated_token_count_total": 4714 }, "hash": "sha256:e858bf6f7cf6af0525ffa8c8f5533e18b8ce0d6bbcd1b2acad71d31137a5f6b4", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -8938,7 +8938,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:235f33cdb64494815dbb3eb58ea98c69935098684e1b34b6d15356bc54b082ea", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9036,7 +9036,7 @@ "estimated_token_count_total": 3782 }, "hash": "sha256:eb4da21d561e9fd9333d97805318f0e263f54570120d3852ce7eba64da604cc2", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9119,7 +9119,7 @@ "estimated_token_count_total": 1797 }, "hash": "sha256:259dcef86aadc513675258b665cc3940db65af6eb32a5db85da6ac339966fa60", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9192,7 +9192,7 @@ "estimated_token_count_total": 3213 }, "hash": "sha256:e448294b6e52291ac0add5fa6533572814e6cd27af42bdaccc2000b86f52d775", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": 
"heuristic-v1" }, { @@ -9250,7 +9250,7 @@ "estimated_token_count_total": 780 }, "hash": "sha256:077e7e5bfc9509cf09f455959a5da7a74b7af69836b3c4b334692f32e306ddf1", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9324,7 +9324,7 @@ "estimated_token_count_total": 1473 }, "hash": "sha256:695c624a1d7a3ed6fea0f4f5c19bb2100be986cec29ba58edb4598b9e9b98494", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9397,7 +9397,7 @@ "estimated_token_count_total": 914 }, "hash": "sha256:8122e21c149d0863cfe3b37fc5606bcdb91668e9d265f0f05451a61ff70e4e93", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9450,7 +9450,7 @@ "estimated_token_count_total": 1394 }, "hash": "sha256:217a79109aff1607594a0238fd91bfa812827620887c4f063c7e0a7a37f967d6", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9477,7 +9477,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:1514316acba1e9bba82ae1c82b09481e9d03d286e6f5d93b66e5a85fd4be7bca", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9545,7 +9545,7 @@ "estimated_token_count_total": 1822 }, "hash": "sha256:db2b1806153242680043ced536f64fc8a2ed3c09adc1bec5aa287168b48e0994", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9608,7 +9608,7 @@ "estimated_token_count_total": 1178 }, "hash": "sha256:9a6b3fa6c005d75c25f0f683b7d8c3b65891454743b794c12b005f910b81609c", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9721,7 +9721,7 @@ "estimated_token_count_total": 5303 }, "hash": "sha256:6078ea5afa297470ab65e55d8da9001ab23d6191c591d708c5007338eb252eb8", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9789,7 +9789,7 @@ "estimated_token_count_total": 891 }, "hash": "sha256:b5acdc9acf0e44836b8a4518155eba7d16cc3b103c557a00970ffb1c44c3e9f6", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9842,7 +9842,7 @@ "estimated_token_count_total": 2553 }, "hash": "sha256:40e799ce83609d6935f058e92cbb2f4c927b31ffcc6d6d7d257423b8388453e6", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9895,7 +9895,7 @@ "estimated_token_count_total": 994 }, "hash": "sha256:6992c9a2d1b315b64d9782880105cf2d436750249a84577aceb95cc213863009", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9927,7 +9927,7 @@ "estimated_token_count_total": 148 }, "hash": "sha256:e8dac01e89b7aac4b887e962e91084c253f5ea25c1abc3a56355390d0c3201c8", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -9954,7 +9954,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:49be4b4b5289572086eaaaf9ccff3bee7879b534188331c9a8052b3fe5aa4933", - "last_modified": "2025-10-23T16:29:28+00:00", + 
"last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10021,7 +10021,7 @@ "estimated_token_count_total": 2251 }, "hash": "sha256:98f8303886011fb13fe8e7a32a8a6150f68703ec7c2a863a21050a35aebf2f36", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10047,13 +10047,13 @@ } ], "stats": { - "chars": 1220, - "words": 171, + "chars": 1198, + "words": 169, "headings": 2, - "estimated_token_count_total": 202 + "estimated_token_count_total": 196 }, - "hash": "sha256:479cbefd4369c5c83ff5ed3aca93db88889fd93f1ffaed7303f60fcd54fc77ce", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:fb892e81a2add1b64214c6cabe837d5068ebe54a7fb65e9149edbfb68f578a53", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10165,7 +10165,7 @@ "estimated_token_count_total": 4844 }, "hash": "sha256:96acff10be56dea76acdb5c915c1dde0eb15eb12eb95e7871eef56bab6cda273", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10237,7 +10237,7 @@ "estimated_token_count_total": 2375 }, "hash": "sha256:61bc251929352f2299ca1d413d05aa9c3672b914575a285d73c7ba53dbd75bff", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10289,7 +10289,7 @@ "estimated_token_count_total": 1461 }, "hash": "sha256:370ed10155cee84889a6d230d0bc3476597448f88a2a271ab87ef893a3268c18", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10320,13 +10320,13 @@ } ], "stats": { - "chars": 1808, - "words": 239, + "chars": 1797, + "words": 238, "headings": 3, - "estimated_token_count_total": 208 + "estimated_token_count_total": 205 }, - "hash": "sha256:d32d46c63294ca7bbdb46d7b64fe28a6744910557c10b02c7fc3a55d53e577b4", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:c1f893d4086b0bf5d6b3c50d0c6cffe27d4deddf1240b250df6432eddcec969c", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10398,7 +10398,7 @@ "estimated_token_count_total": 7755 }, "hash": "sha256:086a87823ab67ceac102358030e316583cd733c0ec326316e7f29061fe7f6934", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10460,7 +10460,7 @@ "estimated_token_count_total": 2770 }, "hash": "sha256:581c8ac75aed22373939de6ad8396ee6e2840fbbaf495e81daf115d444a53017", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10487,7 +10487,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:09264d36777b5acb8cb1f3811e7f2bbe58641e0aac3afd74e426a2533dc9fa61", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10523,13 +10523,13 @@ } ], "stats": { - "chars": 2197, - "words": 272, + "chars": 2175, + "words": 270, "headings": 4, - "estimated_token_count_total": 349 + "estimated_token_count_total": 343 }, - "hash": "sha256:ba83e50c58f45330ce0a74e27d3f764cc1c710eda0fb3d4674b24cf9c87ff6ad", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:dcb1210e19815f3659ea283f8fc04dd5b85bc440a54e31e889f5ec5ae9229b78", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": 
"heuristic-v1" }, { @@ -10611,7 +10611,7 @@ "estimated_token_count_total": 34507 }, "hash": "sha256:21ec1fdbd5e12b831a0c11e24ad4a4917e9d3dc9485d218c2ddc4372fc76e24b", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10637,13 +10637,13 @@ } ], "stats": { - "chars": 883, - "words": 114, + "chars": 872, + "words": 113, "headings": 2, - "estimated_token_count_total": 129 + "estimated_token_count_total": 126 }, - "hash": "sha256:cb856d135b9bcbc3c1c1a2713c70baf2d7a979a918d2f2d27fee82341412bb2b", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:b9c07713604ff9658363bf5e7a0726ecb7781418826ff65abffddffc8083d33f", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -10690,13 +10690,13 @@ } ], "stats": { - "chars": 13089, - "words": 1526, + "chars": 10973, + "words": 1287, "headings": 6, - "estimated_token_count_total": 3099 + "estimated_token_count_total": 2505 }, - "hash": "sha256:d2f3ab658ab29514ac161b17df23e0e7c1f63a7fa4fefcef451ef80b413ab757", - "last_modified": "2025-10-23T16:29:30+00:00", + "hash": "sha256:e074f9f36f0699a69f1613ff0bfa5d6e01619bdf653ea98196209947d10e205e", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -10789,7 +10789,7 @@ "estimated_token_count_total": 5338 }, "hash": "sha256:b3530f5fc5c9e916181dbc259a7fbae9c60100cb0450fc6d47bbb0d140afa075", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -10861,7 +10861,7 @@ "estimated_token_count_total": 4362 }, "hash": "sha256:a66380d109832bbcc12aee24399a728c8213fa2cf6912b7fe563034fc4775593", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -10918,7 +10918,7 @@ "estimated_token_count_total": 2138 }, "hash": "sha256:ff2c267284959711782c0d6ecb4b439c3a6cc31f763d5e1ff2cc3b1f6efb62b2", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -10970,7 +10970,7 @@ "estimated_token_count_total": 2929 }, "hash": "sha256:5d455d265430e71b2a8a8f5d4ba64caab329b5f6e959375ca7c5f8580ed00003", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11062,7 +11062,7 @@ "estimated_token_count_total": 4791 }, "hash": "sha256:24d101e192069fd4a1a829a7197c8878af0eabeca8ef5c3100ddbe46ec116488", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11144,7 +11144,7 @@ "estimated_token_count_total": 2559 }, "hash": "sha256:b446e5283fb0399f16b23a269bbbe8cca7ed08274fae7611e9e3a7aa921ae662", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11207,7 +11207,7 @@ "estimated_token_count_total": 3257 }, "hash": "sha256:5da581453e1e0b0b9659c2c74fe2c0c3e8060fa04445dd389644626e4232916d", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11234,7 +11234,7 @@ "estimated_token_count_total": 42 }, "hash": "sha256:06acc146698c1d3224544987d7ee52da498e3179228f98a494e385c5786a3a2c", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": 
"heuristic-v1" }, { @@ -11271,7 +11271,7 @@ "estimated_token_count_total": 107 }, "hash": "sha256:fdd391227992c966de25b9240f5492135a9993859ec42b77952c1aa3d2e39ed9", - "last_modified": "2025-10-23T16:29:29+00:00", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { @@ -11338,7 +11338,7 @@ "estimated_token_count_total": 4242 }, "hash": "sha256:2f11054e0d31c003ebae5d990b559bd56741d190ca409f6ad060216245fa2d17", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11395,7 +11395,7 @@ "estimated_token_count_total": 2263 }, "hash": "sha256:a6a535f4f5e145d3e2a7518739f752ee3ed37b7745483f414e21c97792331d18", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11443,7 +11443,7 @@ "estimated_token_count_total": 1571 }, "hash": "sha256:3ad540d8ad636304705cccb08bc1fdf21fe2fc7dc0f99bd509b23ae96d20e0ba", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11535,7 +11535,7 @@ "estimated_token_count_total": 2140 }, "hash": "sha256:388c988338ed84589c546bb1606d08641fb931dae307d3df92aeccd2e4986080", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11571,13 +11571,13 @@ } ], "stats": { - "chars": 1778, - "words": 250, + "chars": 1767, + "words": 249, "headings": 4, - "estimated_token_count_total": 400 + "estimated_token_count_total": 397 }, - "hash": "sha256:20879ce95cc9ef66354350e0005642b0b3b35e412c0d4e34ad49aec3fd1a548c", - "last_modified": "2025-10-23T16:29:30+00:00", + "hash": "sha256:9044f2d9bca77f3e0062a47a52c592c756384850b051cb5be4f9373cff79440d", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11614,7 +11614,7 @@ "estimated_token_count_total": 229 }, "hash": "sha256:9d2299b006c2393409ba46f729c6970f9e4d485d5164be6465f5f390969cd881", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11683,7 +11683,7 @@ "estimated_token_count_total": 2737 }, "hash": "sha256:b3bae6a7538228ab76099223b4712df93d53727f2a559faf60516a3cd0165178", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11752,7 +11752,7 @@ "estimated_token_count_total": 2580 }, "hash": "sha256:b6570ad1b32bb07cf0ae33442e6189e1837bfa28f687e589ca189851aa292091", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11784,7 +11784,7 @@ "estimated_token_count_total": 80 }, "hash": "sha256:1dfbb8c3cfa27f92e982b4ce705415e117c50eb38f641691129863b474741da7", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11815,13 +11815,13 @@ } ], "stats": { - "chars": 1489, - "words": 215, + "chars": 1478, + "words": 214, "headings": 3, - "estimated_token_count_total": 258 + "estimated_token_count_total": 255 }, - "hash": "sha256:60a52164328f3776b04f171212dc2407aa991f8f5b83a2d65855c96ca9ea06b2", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:189ac2cf8bfd44fc76a6f45f504effe4ea11653c5d86c7fa825b918fdbd4f564", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" }, { 
@@ -11879,7 +11879,7 @@ "estimated_token_count_total": 2670 }, "hash": "sha256:07629376480e74afc7fe4d91df539b6ab22453df0f8143df11cc51ef9a78f736", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11906,7 +11906,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:94dbafb2d78b87d5f0f0c75de002501b8210ac8d66072bc07989f685837cbac5", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -11960,7 +11960,7 @@ "estimated_token_count_total": 2056 }, "hash": "sha256:cf9197d6909dd8865e8838cad95e3692fefaecc3d2f4773b26809a02051d620f", - "last_modified": "2025-10-23T16:29:30+00:00", + "last_modified": "2025-10-24T21:12:03+00:00", "token_estimator": "heuristic-v1" }, { @@ -12014,7 +12014,7 @@ "estimated_token_count_total": 2220 }, "hash": "sha256:aa6371024bb78c3eeedb6820a37859670046fd0e4f756ad417b20c39fb2983b9", - "last_modified": "2025-10-23T16:29:31+00:00", + "last_modified": "2025-10-24T21:12:04+00:00", "token_estimator": "heuristic-v1" }, { @@ -12061,13 +12061,13 @@ } ], "stats": { - "chars": 8670, - "words": 1178, + "chars": 7480, + "words": 1056, "headings": 6, - "estimated_token_count_total": 1760 + "estimated_token_count_total": 1560 }, - "hash": "sha256:4f3e2e50a595b0f93078ce0f63185a6f367ae341ae8afac10709c061c54b3307", - "last_modified": "2025-10-23T16:29:31+00:00", + "hash": "sha256:31376f2d82192ea9199296355eb227b24ffe3c2dfefe38fdaa17177f269f9684", + "last_modified": "2025-10-24T21:12:04+00:00", "token_estimator": "heuristic-v1" }, { @@ -12134,13 +12134,13 @@ } ], "stats": { - "chars": 21978, - "words": 2569, + "chars": 7495, + "words": 1119, "headings": 10, - "estimated_token_count_total": 5126 + "estimated_token_count_total": 1759 }, - "hash": "sha256:12cbf7c21f969c771ab046ddf836a1967f80e89e8e379d30e4ecdced43d3b160", - "last_modified": "2025-10-23T16:29:34+00:00", + "hash": "sha256:a443b1b89ce287cbaf3cdb5da04e843b8809eb46fb869cbaa17ce1bd92f6b7ab", + "last_modified": "2025-10-24T21:12:08+00:00", "token_estimator": "heuristic-v1" }, { @@ -12217,13 +12217,13 @@ } ], "stats": { - "chars": 27950, - "words": 3205, + "chars": 16479, + "words": 2032, "headings": 12, - "estimated_token_count_total": 6212 + "estimated_token_count_total": 3550 }, - "hash": "sha256:1c637fe7a53d42add1a2ee09091d79f2086f2fe3091674e58b40574ae49bb2ec", - "last_modified": "2025-10-23T16:29:37+00:00", + "hash": "sha256:c3ab578ee1ceaf34852be04c882fc323cc422a894894ba71687f32863b701010", + "last_modified": "2025-10-24T21:12:13+00:00", "token_estimator": "heuristic-v1" }, { @@ -12280,13 +12280,13 @@ } ], "stats": { - "chars": 18691, - "words": 2236, + "chars": 17542, + "words": 2148, "headings": 8, - "estimated_token_count_total": 4130 + "estimated_token_count_total": 3873 }, - "hash": "sha256:576067f5cd3a315dea7998f9bb729c93646e38b586f73402d05be6d75e38c80f", - "last_modified": "2025-10-23T16:29:40+00:00", + "hash": "sha256:29bcad2dfbad9a6407097d5df7268c6e4beb4e6314d6bf832b295d2be5203136", + "last_modified": "2025-10-24T21:12:15+00:00", "token_estimator": "heuristic-v1" }, { @@ -12318,7 +12318,7 @@ "estimated_token_count_total": 77 }, "hash": "sha256:8d8fc5f794d4c793586cd3d412627f5e2fe76f182c75c3687abcf33deed5d65e", - "last_modified": "2025-10-23T16:29:37+00:00", + "last_modified": "2025-10-24T21:12:13+00:00", "token_estimator": "heuristic-v1" }, { @@ -12355,7 +12355,7 @@ "estimated_token_count_total": 130 }, "hash": 
"sha256:66bc34a12c50539dde2ffc69fe66891f73d3e1a2da5833ada15e26744ff32209", - "last_modified": "2025-10-23T16:29:31+00:00", + "last_modified": "2025-10-24T21:12:04+00:00", "token_estimator": "heuristic-v1" }, { @@ -12391,13 +12391,13 @@ } ], "stats": { - "chars": 2498, - "words": 357, + "chars": 2443, + "words": 352, "headings": 4, - "estimated_token_count_total": 594 + "estimated_token_count_total": 579 }, - "hash": "sha256:3708031bfcbb55206c4a9aed14f4afbb9f742ea02ebb70dc350390f484c91a0b", - "last_modified": "2025-10-23T16:29:29+00:00", + "hash": "sha256:eb544bbab067f4b3516190c6fe2df01049970bebb409095af4d3fd9b8bb771fe", + "last_modified": "2025-10-24T21:12:01+00:00", "token_estimator": "heuristic-v1" } ] \ No newline at end of file diff --git a/llms-full.jsonl b/llms-full.jsonl index 0aeaf992d..0a3d748ba 100644 --- a/llms-full.jsonl +++ b/llms-full.jsonl @@ -11,11 +11,11 @@ {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 3, "depth": 2, "title": "Primary Extrinsics of the XCM Pallet", "anchor": "primary-extrinsics-of-the-xcm-pallet", "start_char": 3620, "end_char": 3820, "estimated_token_count": 35, "token_estimator": "heuristic-v1", "text": "## Primary Extrinsics of the XCM Pallet\n\nThis page will highlight the two **Primary Primitive Calls** responsible for sending and executing XCVM programs as dispatchable functions within the pallet."} {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 4, "depth": 3, "title": "Execute", "anchor": "execute", "start_char": 3820, "end_char": 5071, "estimated_token_count": 298, "token_estimator": "heuristic-v1", "text": "### Execute\n\nThe [`execute`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute){target=\\_blank} call directly interacts with the XCM executor, allowing for the execution of XCM messages originating from a locally signed origin. The executor validates the message, ensuring it complies with any configured barriers or filters before executing.\n\nOnce validated, the message is executed locally, and an event is emitted to indicate the result—whether the message was fully executed or only partially completed. Execution is capped by a maximum weight ([`max_weight`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute.field.max_weight){target=\\_blank}); if the required weight exceeds this limit, the message will not be executed.\n\n```rust\npub fn execute(\n message: Box::RuntimeCall>>,\n max_weight: Weight,\n)\n```\n\nFor further details about the `execute` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\\_blank}.\n\n!!!warning\n Partial execution of messages may occur depending on the constraints or barriers applied."} {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 5, "depth": 3, "title": "Send", "anchor": "send", "start_char": 5071, "end_char": 6081, "estimated_token_count": 254, "token_estimator": "heuristic-v1", "text": "### Send\n\nThe [`send`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.send){target=\\_blank} call enables XCM messages to be sent to a specified destination. This could be a parachain, smart contract, or any external system governed by consensus. 
Unlike the execute call, the message is not executed locally but is transported to the destination chain for processing.\n\nThe destination is defined using a [Location](https://paritytech.github.io/polkadot-sdk/master/xcm_docs/glossary/index.html#location){target=\\_blank}, which describes the target chain or system. This ensures precise delivery through the configured XCM transport mechanism.\n\n```rust\npub fn send(\n dest: Box,\n message: Box::RuntimeCall>>,\n)\n```\n\nFor further information about the `send` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\\_blank}."} -{"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 6, "depth": 2, "title": "XCM Router", "anchor": "xcm-router", "start_char": 6081, "end_char": 7146, "estimated_token_count": 240, "token_estimator": "heuristic-v1", "text": "## XCM Router\n\nThe [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/trait.Config.html#associatedtype.XcmRouter){target=\\_blank} is a critical component the XCM pallet requires to facilitate sending XCM messages. It defines where messages can be sent and determines the appropriate XCM transport protocol for the operation.\n\nFor instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\\_blank} from the relay chain to parachains, ensuring secure and controlled communication.\n\n```rust\npub type PriceForChildParachainDelivery =\n\tExponentialPrice;\n```\n\nFor more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\\_blank} page."} +{"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 6, "depth": 2, "title": "XCM Router", "anchor": "xcm-router", "start_char": 6081, "end_char": 7032, "estimated_token_count": 225, "token_estimator": "heuristic-v1", "text": "## XCM Router\n\nThe [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/trait.Config.html#associatedtype.XcmRouter){target=\\_blank} is a critical component the XCM pallet requires to facilitate sending XCM messages. 
It defines where messages can be sent and determines the appropriate XCM transport protocol for the operation.\n\nFor instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\\_blank} from the relay chain to parachains, ensuring secure and controlled communication.\n\n```rust\n\n```\n\nFor more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\\_blank} page."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 25, "end_char": 875, "estimated_token_count": 162, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCross-Consensus Messaging (XCM) is a core feature of the Polkadot ecosystem, enabling communication between parachains, relay chains, and system chains. To ensure the reliability of XCM-powered blockchains, thorough testing and debugging are essential before production deployment.\n\nThis guide covers the XCM Emulator, a tool designed to facilitate onboarding and testing for developers. Use the emulator if:\n\n- A live runtime is not yet available.\n- Extensive configuration adjustments are needed, as emulated chains differ from live networks.\n- Rust-based tests are preferred for automation and integration.\n\nFor scenarios where real blockchain state is required, [Chopsticks](/tutorials/polkadot-sdk/testing/fork-live-chains/#xcm-testing){target=\\_blank} allows testing with any client compatible with Polkadot SDK-based chains."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 1, "depth": 2, "title": "XCM Emulator", "anchor": "xcm-emulator", "start_char": 875, "end_char": 2026, "estimated_token_count": 225, "token_estimator": "heuristic-v1", "text": "## XCM Emulator\n\nSetting up a live network with multiple interconnected parachains for XCM testing can be complex and resource-intensive. \n\nThe [`xcm-emulator`](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/xcm/xcm-emulator){target=\\_blank} is a tool designed to simulate the execution of XCM programs using predefined runtime configurations. These configurations include those utilized by live networks like Kusama, Polkadot, and Asset Hub.\n\nThis tool enables testing of cross-chain message passing, providing a way to verify outcomes, weights, and side effects efficiently. It achieves this by utilizing mocked runtimes for both the relay chain and connected parachains, enabling developers to focus on message logic and configuration without needing a live network.\n\nThe `xcm-emulator` relies on transport layer pallets. However, the messages do not leverage the same messaging infrastructure as live networks since the transport mechanism is mocked. Additionally, consensus-related events are not covered, such as disputes and staking events. 
Parachains should use end-to-end (E2E) tests to validate these events."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 2, "depth": 3, "title": "Advantages and Limitations", "anchor": "advantages-and-limitations", "start_char": 2026, "end_char": 3130, "estimated_token_count": 219, "token_estimator": "heuristic-v1", "text": "### Advantages and Limitations\n\nThe XCM Emulator provides both advantages and limitations when testing cross-chain communication in simulated environments.\n\n- **Advantages**:\n - **Interactive debugging**: Offers tracing capabilities similar to EVM, enabling detailed analysis of issues.\n - **Runtime composability**: Facilitates testing and integration of multiple runtime components.\n - **Immediate feedback**: Supports Test-Driven Development (TDD) by providing rapid test results.\n - **Seamless integration testing**: Simplifies the process of testing new runtime versions in an isolated environment.\n\n- **Limitations**:\n - **Simplified emulation**: Always assumes message delivery, which may not mimic real-world network behavior.\n - **Dependency challenges**: Requires careful management of dependency versions and patching. Refer to the [Cargo dependency documentation](https://doc.rust-lang.org/cargo/reference/overriding-dependencies.html){target=\\_blank}.\n - **Compilation overhead**: Testing environments can be resource-intensive, requiring frequent compilation updates."} -{"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 3, "depth": 3, "title": "How Does It Work?", "anchor": "how-does-it-work", "start_char": 3130, "end_char": 7729, "estimated_token_count": 881, "token_estimator": "heuristic-v1", "text": "### How Does It Work?\n\nThe `xcm-emulator` provides macros for defining a mocked testing environment. Check all the existing macros and functionality in the [XCM Emulator source code](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs){target=\\_blank}. The most important macros are:\n\n- **[`decl_test_relay_chains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L361){target=\\_blank}**: Defines runtime and configuration for the relay chains. Example:\n\n ```rust\n decl_test_relay_chains! {\n \t#[api_version(13)]\n \tpub struct Westend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = (),\n \t\truntime = westend_runtime,\n \t\tcore = {\n \t\t\tSovereignAccountOf: westend_runtime::xcm_config::LocationConverter,\n \t\t},\n \t\tpallets = {\n \t\t\tXcmPallet: westend_runtime::XcmPallet,\n \t\t\tSudo: westend_runtime::Sudo,\n \t\t\tBalances: westend_runtime::Balances,\n \t\t\tTreasury: westend_runtime::Treasury,\n \t\t\tAssetRate: westend_runtime::AssetRate,\n \t\t\tHrmp: westend_runtime::Hrmp,\n \t\t\tIdentity: westend_runtime::Identity,\n \t\t\tIdentityMigrator: westend_runtime::IdentityMigrator,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\\_blank}**: Defines runtime and configuration for parachains. Example:\n\n ```rust\n decl_test_parachains! 
{\n \tpub struct AssetHubWestend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = {\n \t\t\tasset_hub_westend_runtime::AuraExt::on_initialize(1);\n \t\t},\n \t\truntime = asset_hub_westend_runtime,\n \t\tcore = {\n \t\t\tXcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue,\n \t\t\tLocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId,\n \t\t\tParachainInfo: asset_hub_westend_runtime::ParachainInfo,\n \t\t\tMessageOrigin: cumulus_primitives_core::AggregateMessageOrigin,\n \t\t\tDigestProvider: (),\n \t\t},\n \t\tpallets = {\n \t\t\tPolkadotXcm: asset_hub_westend_runtime::PolkadotXcm,\n \t\t\tBalances: asset_hub_westend_runtime::Balances,\n \t\t\tAssets: asset_hub_westend_runtime::Assets,\n \t\t\tForeignAssets: asset_hub_westend_runtime::ForeignAssets,\n \t\t\tPoolAssets: asset_hub_westend_runtime::PoolAssets,\n \t\t\tAssetConversion: asset_hub_westend_runtime::AssetConversion,\n \t\t\tSnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend,\n \t\t\tRevive: asset_hub_westend_runtime::Revive,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example:\n\n ```rust\n decl_test_bridges! {\n \tpub struct RococoWestendMockBridge {\n \t\tsource = BridgeHubRococoPara,\n \t\ttarget = BridgeHubWestendPara,\n \t\thandler = RococoWestendMessageHandler\n \t},\n \tpub struct WestendRococoMockBridge {\n \t\tsource = BridgeHubWestendPara,\n \t\ttarget = BridgeHubRococoPara,\n \t\thandler = WestendRococoMessageHandler\n \t}\n }\n ```\n\n- **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example:\n\n ```rust\n decl_test_networks! {\n \tpub struct WestendMockNet {\n \t\trelay_chain = Westend,\n \t\tparachains = vec![\n \t\t\tAssetHubWestend,\n \t\t\tBridgeHubWestend,\n \t\t\tCollectivesWestend,\n \t\t\tCoretimeWestend,\n \t\t\tPeopleWestend,\n \t\t\tPenpalA,\n \t\t\tPenpalB,\n \t\t],\n \t\tbridge = ()\n \t},\n }\n ```\n\nBy leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\\_blank} article. \n\nThis framework enables thorough testing of runtime and cross-chain interactions, enabling developers to effectively design, test, and optimize cross-chain functionality.\n\nTo see a complete example of implementing and executing tests, refer to the [integration tests](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/parachains/integration-tests/emulated){target=\\_blank} in the Polkadot SDK repository."} +{"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 3, "depth": 3, "title": "How Does It Work?", "anchor": "how-does-it-work", "start_char": 3130, "end_char": 5974, "estimated_token_count": 665, "token_estimator": "heuristic-v1", "text": "### How Does It Work?\n\nThe `xcm-emulator` provides macros for defining a mocked testing environment. 
Check all the existing macros and functionality in the [XCM Emulator source code](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs){target=\\_blank}. The most important macros are:\n\n- **[`decl_test_relay_chains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L361){target=\\_blank}**: Defines runtime and configuration for the relay chains. Example:\n\n ```rust\n decl_test_relay_chains! {\n \t#[api_version(13)]\n \tpub struct Westend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = (),\n \t\truntime = westend_runtime,\n \t\tcore = {\n \t\t\tSovereignAccountOf: westend_runtime::xcm_config::LocationConverter,\n \t\t},\n \t\tpallets = {\n \t\t\tXcmPallet: westend_runtime::XcmPallet,\n \t\t\tSudo: westend_runtime::Sudo,\n \t\t\tBalances: westend_runtime::Balances,\n \t\t\tTreasury: westend_runtime::Treasury,\n \t\t\tAssetRate: westend_runtime::AssetRate,\n \t\t\tHrmp: westend_runtime::Hrmp,\n \t\t\tIdentity: westend_runtime::Identity,\n \t\t\tIdentityMigrator: westend_runtime::IdentityMigrator,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\\_blank}**: Defines runtime and configuration for parachains. Example:\n\n ```rust\n \n ```\n\n- **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example:\n\n ```rust\n \n ```\n\n- **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example:\n\n ```rust\n \n ```\n\nBy leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\\_blank} article. 
\n\nThis framework enables thorough testing of runtime and cross-chain interactions, enabling developers to effectively design, test, and optimize cross-chain functionality.\n\nTo see a complete example of implementing and executing tests, refer to the [integration tests](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/parachains/integration-tests/emulated){target=\\_blank} in the Polkadot SDK repository."} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 0, "depth": 2, "title": "The problem before v5", "anchor": "the-problem-before-v5", "start_char": 446, "end_char": 932, "estimated_token_count": 102, "token_estimator": "heuristic-v1", "text": "## The problem before v5\n\nWhen XCM execution failed and assets became trapped:\n\n- **Governance dependency**: Most trapped asset recovery requires governance proposals.\n- **Complex procedures**: Manual intervention through referendum processes.\n- **Long delays**: Recovery could take weeks or months through governance.\n- **Risk of loss**: Assets could remain permanently trapped if governance didn't act.\n- **High barriers**: Small amounts often weren't worth the governance overhead."} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 1, "depth": 2, "title": "The V5 Solution: `AssetClaimer` Hint", "anchor": "the-v5-solution-assetclaimer-hint", "start_char": 932, "end_char": 1343, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## The V5 Solution: `AssetClaimer` Hint\n\nThe new [`AssetClaimer`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Hint.html#variant.AssetClaimer){target=\\_blank} hint allows XCM programs to preemptively designate who can claim trapped assets:\n\n```typescript\n// Set asset claimer before risky operations\nXcmV5Instruction.SetHints({ \n hints: [Enum('AssetClaimer', claimerLocation)] \n})\n```"} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 2, "depth": 2, "title": "How it Improves the Situation", "anchor": "how-it-improves-the-situation", "start_char": 1343, "end_char": 2490, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## How it Improves the Situation\n\nThe `AssetClaimer` hint transforms the recovery process by allowing proactive designation of claimers, eliminating the need for governance intervention in most cases.\n\n- **Before XCM V5:**\n\n ```typescript\n // If this XCM fails, assets become trapped\n const riskyXcm = [\n XcmInstruction.WithdrawAsset([assets]),\n XcmInstruction.BuyExecution({ fees, weight_limit }),\n XcmInstruction.Transact({ /* risky call */ }),\n XcmInstruction.DepositAsset({ assets, beneficiary })\n ]\n\n // Recovery required governance intervention\n ```\n\n- **With XCM V5:**\n\n ```typescript\n // Proactive asset claimer setup\n const saferXcm = [\n // Anyone can now claim if execution fails\n XcmV5Instruction.SetHints({ \n hints: [Enum('AssetClaimer', claimerLocation)] \n }),\n XcmV5Instruction.WithdrawAsset([assets]),\n XcmV5Instruction.PayFees({ asset }),\n XcmV5Instruction.Transact({ /* risky call */ }),\n XcmV5Instruction.DepositAsset({ assets, beneficiary })\n ]\n\n // Recovery can be done immediately by the claimer\n ```"} @@ -104,18 +104,18 @@ {"page_id": "develop-interoperability-xcm-guides-from-apps-transfers", "page_title": "Transfers", "index": 3, "depth": 2, "title": "Origin Preservation", "anchor": 
"origin-preservation", "start_char": 11158, "end_char": 20278, "estimated_token_count": 1857, "token_estimator": "heuristic-v1", "text": "## Origin Preservation\n\nIn previous versions of XCM, doing cross-chain transfers meant losing the origin. The XCM on the destination chain would have access to the transferred assets, but not to the origin. This means any instruction which uses assets but not the origin could be executed, that's enough to call [`DepositAsset`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.DepositAsset){target=\\_blank} for example and complete the transfer, but not to call [`Transact`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.Transact){target=\\_blank} and execute a call.\n\nIn XCMv5, [`InitiateTransfer`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.InitiateTransfer){target=\\_blank} allows **preserving the origin**, enabling more use-cases such as executing a call on the destination chain via [`Transact`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.Transact){target=\\_blank}.\nTo enable this feature, the [`preserve_origin`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Instruction.html#variant.InitiateTransfer.field.preserve_origin){target=\\_blank} parameter must be set to `true`.\n\n!!! note \"Why isn't preserving the origin the default?\"\n\n Preserving the origin requires a specific configuration on the underlying chain executing the XCM. Some chains have the right configuration, for example all system chains, but not every chain has it. If you make a transfer with `preserve_origin: true` to a chain configured incorrectly, the transfer will fail.\n\n However, if you set `preserve_origin: false` then there is no problem. Because of this, origin preservation is not the default, and likely never will be.\n\n??? code \"Teleport and Transact Example\"\n\n This example creates an XCM program that teleports DOT from Asset Hub to People and executes a call there. 
The whole script is almost the same as the one for a simple teleport above, most changes are in the `remoteXcm` variable.\n\n The setup for this script is [installing PAPI](/develop/toolkit/api-libraries/papi#get-started){target=\\_blank} and generating descriptors for both Asset Hub and People:\n `bun papi add ahp -n polkadot_asset_hub && bun papi add people -n polkadot_people`\n\n ```typescript title=\"teleport-and-transact.ts\"\n // `ahp` is the name given to `npx papi add`\n import {\n ahp,\n people,\n XcmV2OriginKind,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3MultiassetFungibility,\n XcmV5AssetFilter,\n XcmV5Instruction,\n XcmV5Junction,\n XcmV5Junctions,\n XcmV5WildAsset,\n XcmVersionedXcm,\n } from '@polkadot-api/descriptors';\n import { Binary, createClient, Enum, FixedSizeBinary } from 'polkadot-api';\n // import from \"polkadot-api/ws-provider/node\"\n // if running in a NodeJS environment\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import { sr25519CreateDerive } from '@polkadot-labs/hdkd';\n import {\n DEV_PHRASE,\n entropyToMiniSecret,\n mnemonicToEntropy,\n ss58Address,\n } from '@polkadot-labs/hdkd-helpers';\n import { getPolkadotSigner } from 'polkadot-api/signer';\n\n const entropy = mnemonicToEntropy(DEV_PHRASE);\n const miniSecret = entropyToMiniSecret(entropy);\n const derive = sr25519CreateDerive(miniSecret);\n const keyPair = derive('//Alice');\n\n const polkadotSigner = getPolkadotSigner(\n keyPair.publicKey,\n 'Sr25519',\n keyPair.sign\n );\n\n // Connect to Polkadot Asset Hub.\n // Pointing to localhost since this example uses chopsticks.\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('ws://localhost:8000'))\n );\n\n // Get the typed API, a typesafe API for interacting with the chain.\n const ahpApi = client.getTypedApi(ahp);\n\n const PEOPLE_PARA_ID = 1004;\n // The identifier for DOT is the location of the Polkadot Relay Chain,\n // which is 1 up relative to any parachain.\n const DOT = {\n parents: 1,\n interior: XcmV3Junctions.Here(),\n };\n // DOT has 10 decimals.\n const DOT_UNITS = 10_000_000_000n;\n\n // The DOT to withdraw for both fees and transfer.\n const dotToWithdraw = {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(10n * DOT_UNITS),\n };\n // The DOT to use for local fee payment.\n const dotToPayFees = {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(1n * DOT_UNITS),\n };\n // The location of the People Chain from Asset Hub.\n const destination = {\n parents: 1,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(PEOPLE_PARA_ID)),\n };\n // Pay for fees on the People Chain with teleported DOT.\n // This is specified independently of the transferred assets since they're used\n // exclusively for fees. Also because fees can be paid in a different\n // asset from the transferred assets.\n const remoteFees = Enum(\n 'Teleport',\n XcmV5AssetFilter.Definite([\n {\n id: DOT,\n fun: XcmV3MultiassetFungibility.Fungible(1n * DOT_UNITS),\n },\n ])\n );\n // No need to preserve origin for this example.\n const preserveOrigin = false;\n // The assets to transfer are whatever remains in the\n // holding register at the time of executing the `InitiateTransfer`\n // instruction. 
DOT in this case, teleported.\n const assets = [\n Enum('Teleport', XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1))),\n ];\n // The beneficiary is the same account but on the People Chain.\n // This is a very common pattern for one public/private key pair\n // to hold assets on multiple chains.\n const beneficiary = FixedSizeBinary.fromBytes(keyPair.publicKey);\n // The call to be executed on the destination chain.\n // It's a simple remark with an event.\n // Create the call on Asset Hub since the system pallet is present in\n // every runtime, but if using any other pallet, connect to\n // the destination chain and create the call there.\n const remark = Binary.fromText('Hello, cross-chain!');\n const call = await ahpApi.tx.System.remark_with_event({\n remark,\n }).getEncodedData();\n // The XCM to be executed on the destination chain.\n // It's basically depositing everything to the beneficiary.\n const remoteXcm = [\n XcmV5Instruction.Transact({\n origin_kind: XcmV2OriginKind.SovereignAccount(),\n fallback_max_weight: undefined,\n call,\n }),\n XcmV5Instruction.RefundSurplus(),\n XcmV5Instruction.DepositAsset({\n assets: XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1)),\n beneficiary: {\n parents: 0,\n interior: XcmV5Junctions.X1(\n XcmV5Junction.AccountId32({\n id: beneficiary,\n network: undefined,\n })\n ),\n },\n }),\n ];\n\n // The message assembles all the previously defined parameters.\n const xcm = XcmVersionedXcm.V5([\n XcmV5Instruction.WithdrawAsset([dotToWithdraw]),\n XcmV5Instruction.PayFees({ asset: dotToPayFees }),\n XcmV5Instruction.InitiateTransfer({\n destination,\n remote_fees: remoteFees,\n preserve_origin: preserveOrigin,\n assets,\n remote_xcm: remoteXcm,\n }),\n // Return any leftover fees from the fees register back to holding.\n XcmV5Instruction.RefundSurplus(),\n // Deposit remaining assets (refunded fees) to the originating account.\n // Using AllCounted(1) since only one asset type (DOT) remains - a minor optimization.\n XcmV5Instruction.DepositAsset({\n assets: XcmV5AssetFilter.Wild(XcmV5WildAsset.AllCounted(1)),\n beneficiary: {\n parents: 0,\n interior: XcmV5Junctions.X1(\n XcmV5Junction.AccountId32({\n id: beneficiary, // The originating account.\n network: undefined,\n })\n ),\n },\n }),\n ]);\n\n // The XCM weight is needed to set the `max_weight` parameter\n // on the actual `PolkadotXcm.execute()` call.\n const weightResult = await ahpApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n if (weightResult.success) {\n const weight = weightResult.success\n ? weightResult.value\n : { ref_time: 0n, proof_size: 0n };\n\n console.dir(weight);\n\n // The actual transaction to submit.\n // This tells Asset Hub to execute the XCM.\n const tx = ahpApi.tx.PolkadotXcm.execute({\n message: xcm,\n max_weight: weight,\n });\n\n // Sign and propagate to the network.\n const result = await tx.signAndSubmit(polkadotSigner);\n console.log(stringify(result));\n }\n\n client.destroy();\n\n // A helper function to print numbers inside of the result.\n function stringify(obj: any) {\n return JSON.stringify(\n obj,\n (_, v) => (typeof v === 'bigint' ? 
v.toString() : v),\n 2\n );\n }\n\n ```"} {"page_id": "develop-interoperability-xcm-guides-from-apps", "page_title": "From Apps", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 462, "end_char": 511, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 466, "end_char": 516, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 516, "end_char": 1663, "estimated_token_count": 328, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 516, "end_char": 1619, "estimated_token_count": 316, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 20, "end_char": 932, "estimated_token_count": 159, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nRuntime APIs allow node-side code to extract information from the runtime state. While simple storage access retrieves stored values directly, runtime APIs enable arbitrary computation, making them a powerful tool for interacting with the chain's state.\n\nUnlike direct storage access, runtime APIs can derive values from storage based on arguments or perform computations that don't require storage access. For example, a runtime API might expose a formula for fee calculation, using only the provided arguments as inputs rather than fetching data from storage.\n\nIn general, runtime APIs are used for:\n\n- Accessing a storage item.\n- Retrieving a bundle of related storage items.\n- Deriving a value from storage based on arguments.\n- Exposing formulas for complex computational calculations.\n\nThis section will teach you about specific runtime APIs that support XCM processing and manipulation."} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 1, "depth": 2, "title": "Dry Run API", "anchor": "dry-run-api", "start_char": 932, "end_char": 1492, "estimated_token_count": 140, "token_estimator": "heuristic-v1", "text": "## Dry Run API\n\nThe [Dry-run API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/dry_run/trait.DryRunApi.html){target=\\_blank}, given an extrinsic, or an XCM program, returns its effects:\n\n- Execution result\n- Local XCM (in the case of an extrinsic)\n- Forwarded XCMs\n- List of events\n\nThis API can be used independently for dry-running, double-checking, or testing. 
However, it mainly shines when used with the [Xcm Payment API](#xcm-payment-api), given that it only estimates fees if you know the specific XCM you want to execute or send."} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 2, "depth": 3, "title": "Dry Run Call", "anchor": "dry-run-call", "start_char": 1492, "end_char": 10429, "estimated_token_count": 1647, "token_estimator": "heuristic-v1", "text": "### Dry Run Call\n\nThis API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains.\n\n```rust\nfn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `origin` ++\"OriginCaller\"++ ++\"required\"++\n \n The origin used for executing the transaction.\n\n ---\n\n `call` ++\"Call\"++ ++\"required\"++\n\n The extrinsic to be executed.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `CallDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `local_xcm` ++\"Option>\"++\n\n The local XCM that was attempted to be executed, if any.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#reserve-asset-transfer){target=\\_blank} mechanism. 
Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { paseo } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n PolkadotRuntimeOriginCaller,\n XcmVersionedLocation,\n XcmVersionedAssets,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n } from '@polkadot-api/descriptors';\n import { DispatchRawOrigin } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to the Paseo relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const popParaID = 4001;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin caller\n // This is a regular signed account owned by a user\n let origin = PolkadotRuntimeOriginCaller.system(\n DispatchRawOrigin.Signed(userAddress),\n );\n\n // Define a transaction to transfer assets from Polkadot to Pop Network using a Reserve Transfer\n const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({\n dest: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain\n ),\n }),\n beneficiary: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n // Beneficiary address on Pop Network\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n }),\n assets: XcmVersionedAssets.V3([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 0,\n interior: XcmV3Junctions.Here(), // Native asset from the sender. 
In this case PAS\n }),\n fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer\n },\n ]),\n fee_asset_item: 0, // Asset used to pay transaction fees\n weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction\n });\n\n // Execute the dry run call to simulate the transaction\n const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call(\n origin,\n tx.decodedCall,\n );\n\n // Extract the data from the dry run result\n const {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n forwarded_xcms: forwardedXcms,\n } = dryRunResult.value;\n\n // Extract the XCM generated by this call\n const xcmsToPop = forwardedXcms.find(\n ([location, _]) =>\n location.type === 'V4' &&\n location.value.parents === 0 &&\n location.value.interior.type === 'X1' &&\n location.value.interior.value.type === 'Parachain' &&\n location.value.interior.value.value === popParaID, // Pop network's ParaID\n );\n const destination = xcmsToPop[0];\n const remoteXcm = xcmsToPop[1][0];\n\n // Print the results\n const resultObject = {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n destination: destination,\n remote_xcm: remoteXcm,\n };\n\n console.dir(resultObject, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n        {\n          execution_result: {\n            success: true,\n            value: {\n              actual_weight: undefined,\n              pays_fee: { type: 'Yes', value: undefined }\n            }\n          },\n          emitted_events: [\n                ...\n          ],\n          local_xcm: undefined,\n          destination: {\n            type: 'V4',\n            value: {\n              parents: 0,\n              interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }\n            }\n          },\n          remote_xcm: {\n            type: 'V3',\n            value: [\n              {\n                type: 'ReserveAssetDeposited',\n                value: [\n                  {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  }\n                ]\n              },\n              { type: 'ClearOrigin', value: undefined },\n              {\n                type: 'BuyExecution',\n                value: {\n                  fees: {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  },\n                  weight_limit: { type: 'Unlimited', value: undefined }\n                }\n              },\n              {\n                type: 'DepositAsset',\n                value: {\n                  assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },\n                  beneficiary: {\n                    parents: 0,\n                    interior: {\n                      type: 'X1',\n                      value: {\n                        type: 'AccountId32',\n                        value: {\n                          network: undefined,\n                          id: FixedSizeBinary {\n                            asText: [Function (anonymous)],\n                            asHex: [Function (anonymous)],\n                            asOpaqueHex: [Function (anonymous)],\n                            asBytes: [Function (anonymous)],\n                            asOpaqueBytes: [Function (anonymous)]\n                          }\n                        }\n                      }\n                    }\n                  }\n                }\n              },\n              {\n                type: 'SetTopic',\n                value: FixedSizeBinary {\n                  asText: [Function (anonymous)],\n                  asHex: [Function (anonymous)],\n                  asOpaqueHex: [Function (anonymous)],\n                  asBytes: [Function (anonymous)],\n                  asOpaqueBytes: [Function (anonymous)]\n                }\n              }\n            ]\n          }\n        }      \n      
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 3, "depth": 3, "title": "Dry Run XCM", "anchor": "dry-run-xcm", "start_char": 10429, "end_char": 16835, "estimated_token_count": 1176, "token_estimator": "heuristic-v1", "text": "### Dry Run XCM\n\nThis API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains.\n\n```rust\nfn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `origin_location` ++\"VersionedLocation\"++ ++\"required\"++\n\n The location of the origin that will execute the xcm message.\n\n ---\n\n `xcm` ++\"VersionedXcm\"++ ++\"required\"++\n\n A versioned XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `XcmDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a [teleport asset transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain. 
The code shows how to test and verify the received XCM message's behavior in the destination chain through a dry run on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin\n const origin = XcmVersionedLocation.V3({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n });\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute dry run xcm\n const dryRunResult = await paseoAssetHubApi.apis.DryRunApi.dry_run_xcm(\n origin,\n xcm,\n );\n\n // Print the results\n console.dir(dryRunResult.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n        {\n          execution_result: {\n            type: 'Complete',\n            value: { used: { ref_time: 15574200000n, proof_size: 359300n } }\n          },\n          emitted_events: [\n            {\n              type: 'System',\n              value: {\n                type: 'NewAccount',\n                value: { account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET' }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Endowed',\n                value: {\n                  account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  free_balance: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Minted',\n                value: {\n                  who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  amount: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: { type: 'Issued', value: { amount: 1796500000n } }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Deposit',\n                value: {\n                  who: '13UVJyLgBASGhE2ok3TvxUfaQBGUt88JCcdYjHvUhvQkFTTx',\n                  amount: 1796500000n\n                }\n              }\n            }\n          ],\n          forwarded_xcms: [\n            [\n              {\n                type: 'V4',\n                value: { parents: 1, interior: { type: 'Here', value: undefined } }\n              },\n              []\n            ]\n          ]\n        }\n      
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 4, "depth": 2, "title": "XCM Payment API", "anchor": "xcm-payment-api", "start_char": 16835, "end_char": 17921, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## XCM Payment API\n\nThe [XCM Payment API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/fees/trait.XcmPaymentApi.html){target=\\_blank} provides a standardized way to determine the costs and payment options for executing XCM messages. Specifically, it enables clients to:\n\n- Retrieve the [weight](/polkadot-protocol/glossary/#weight) required to execute an XCM message.\n- Obtain a list of acceptable `AssetIds` for paying execution fees.\n- Calculate the cost of the weight in a specified `AssetId`.\n- Estimate the fees for XCM message delivery.\n\nThis API eliminates the need for clients to guess execution fees or identify acceptable assets manually. Instead, clients can query the list of supported asset IDs formatted according to the XCM version they understand. With this information, they can weigh the XCM program they intend to execute and convert the computed weight into its cost using one of the acceptable assets.\n\nTo use the API effectively, the client must already know the XCM program to be executed and the chains involved in the program's execution."} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 5, "depth": 3, "title": "Query Acceptable Payment Assets", "anchor": "query-acceptable-payment-assets", "start_char": 17921, "end_char": 20567, "estimated_token_count": 582, "token_estimator": "heuristic-v1", "text": "### Query Acceptable Payment Assets\n\nRetrieves the list of assets that are acceptable for paying fees when using a specific XCM version\n\n```rust\nfn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `xcm_version` ++\"Version\"++ ++\"required\"++\n\n Specifies the XCM version that will be used to send the XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n A list of acceptable payment assets. Each asset is provided in a versioned format (`VersionedAssetId`) that matches the specified XCM version. If an error occurs, it is returned instead of the asset list.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the acceptable payment assets for executing XCM messages on the Paseo Asset Hub network using XCM version 3.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the xcm version to use\n const xcmVersion = 3;\n\n // Execute the runtime call to query the assets\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_acceptable_payment_assets(\n xcmVersion,\n );\n\n // Print the assets\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n        [\n          {\n            type: 'V3',\n            value: {\n              type: 'Concrete',\n              value: { parents: 1, interior: { type: 'Here', value: undefined } }\n            }\n          }\n        ]\n      
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 6, "depth": 3, "title": "Query XCM Weight", "anchor": "query-xcm-weight", "start_char": 20567, "end_char": 25168, "estimated_token_count": 922, "token_estimator": "heuristic-v1", "text": "### Query XCM Weight\n\nCalculates the weight required to execute a given XCM message. It is useful for estimating the execution cost of a cross-chain message in the destination chain before sending it.\n\n```rust\nfn query_xcm_weight(message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n A versioned XCM message whose execution weight is being queried.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated weight required to execute the provided XCM message. If the calculation fails, an error is returned instead.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the weight needed to execute a [teleport transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain using the XCM Payment API. 
The result shows the required weight in terms of reference time and proof size needed in the destination chain.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query weight runtime call\n const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n { ref_time: 15574200000n, proof_size: 359300n }\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 7, "depth": 3, "title": "Query Weight to Asset Fee", "anchor": "query-weight-to-asset-fee", "start_char": 25168, "end_char": 28210, "estimated_token_count": 699, "token_estimator": "heuristic-v1", "text": "### Query Weight to Asset Fee\n\nConverts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset.\n\n```rust\nfn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `weight` ++\"Weight\"++ ++\"required\"++\n \n The execution weight to be converted into a fee.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ---\n\n `asset` ++\"VersionedAssetId\"++ ++\"required\"++\n \n The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The fee needed to pay for the execution for the given `AssetId.`\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the weight to convert to fee\n const weight = { ref_time: 15574200000n, proof_size: 359300n };\n\n // Define the versioned asset id\n const versionedAssetId = {\n type: 'V4',\n value: { parents: 1, interior: { type: 'Here', value: undefined } },\n };\n\n // Execute the runtime call to convert the weight to fee\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(\n weight,\n versionedAssetId,\n );\n\n // Print the fee\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n 1796500000n\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 8, "depth": 3, "title": "Query Delivery Fees", "anchor": "query-delivery-fees", "start_char": 28210, "end_char": 33184, "estimated_token_count": 965, "token_estimator": "heuristic-v1", "text": "### Query Delivery Fees\n\nRetrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain.\n\n```rust\nfn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `destination` ++\"VersionedLocation\"++ ++\"required\"++\n \n The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms.\n\n ---\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseo,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const paseoAssetHubParaID = 1000;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the destination\n const destination = XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)),\n });\n\n // Define the xcm message that will be sent to the destination\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query delivery fees runtime call\n const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees(\n destination,\n xcm,\n );\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n        {\n          type: 'V3',\n          value: [\n            {\n              id: {\n                type: 'Concrete',\n                value: { parents: 0, interior: { type: 'Here', value: undefined } }\n              },\n              fun: { type: 'Fungible', value: 396000000n }\n            }\n          ]\n        }\n      
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 2, "depth": 3, "title": "Dry Run Call", "anchor": "dry-run-call", "start_char": 1492, "end_char": 10303, "estimated_token_count": 1620, "token_estimator": "heuristic-v1", "text": "### Dry Run Call\n\nThis API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains.\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `origin` ++\"OriginCaller\"++ ++\"required\"++\n \n The origin used for executing the transaction.\n\n ---\n\n `call` ++\"Call\"++ ++\"required\"++\n\n The extrinsic to be executed.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `CallDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `local_xcm` ++\"Option>\"++\n\n The local XCM that was attempted to be executed, if any.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#reserve-asset-transfer){target=\\_blank} mechanism. 
Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { paseo } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n PolkadotRuntimeOriginCaller,\n XcmVersionedLocation,\n XcmVersionedAssets,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n } from '@polkadot-api/descriptors';\n import { DispatchRawOrigin } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to the Paseo relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const popParaID = 4001;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin caller\n // This is a regular signed account owned by a user\n let origin = PolkadotRuntimeOriginCaller.system(\n DispatchRawOrigin.Signed(userAddress),\n );\n\n // Define a transaction to transfer assets from Polkadot to Pop Network using a Reserve Transfer\n const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({\n dest: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain\n ),\n }),\n beneficiary: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n // Beneficiary address on Pop Network\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n }),\n assets: XcmVersionedAssets.V3([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 0,\n interior: XcmV3Junctions.Here(), // Native asset from the sender. 
In this case PAS\n }),\n fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer\n },\n ]),\n fee_asset_item: 0, // Asset used to pay transaction fees\n weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction\n });\n\n // Execute the dry run call to simulate the transaction\n const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call(\n origin,\n tx.decodedCall,\n );\n\n // Extract the data from the dry run result\n const {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n forwarded_xcms: forwardedXcms,\n } = dryRunResult.value;\n\n // Extract the XCM generated by this call\n const xcmsToPop = forwardedXcms.find(\n ([location, _]) =>\n location.type === 'V4' &&\n location.value.parents === 0 &&\n location.value.interior.type === 'X1' &&\n location.value.interior.value.type === 'Parachain' &&\n location.value.interior.value.value === popParaID, // Pop network's ParaID\n );\n const destination = xcmsToPop[0];\n const remoteXcm = xcmsToPop[1][0];\n\n // Print the results\n const resultObject = {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n destination: destination,\n remote_xcm: remoteXcm,\n };\n\n console.dir(resultObject, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            success: true,\n            value: {\n              actual_weight: undefined,\n              pays_fee: { type: 'Yes', value: undefined }\n            }\n          },\n          emitted_events: [\n                ...\n          ],\n          local_xcm: undefined,\n          destination: {\n            type: 'V4',\n            value: {\n              parents: 0,\n              interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }\n            }\n          },\n          remote_xcm: {\n            type: 'V3',\n            value: [\n              {\n                type: 'ReserveAssetDeposited',\n                value: [\n                  {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  }\n                ]\n              },\n              { type: 'ClearOrigin', value: undefined },\n              {\n                type: 'BuyExecution',\n                value: {\n                  fees: {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  },\n                  weight_limit: { type: 'Unlimited', value: undefined }\n                }\n              },\n              {\n                type: 'DepositAsset',\n                value: {\n                  assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },\n                  beneficiary: {\n                    parents: 0,\n                    interior: {\n                      type: 'X1',\n                      value: {\n                        type: 'AccountId32',\n                        value: {\n                          network: undefined,\n                          id: FixedSizeBinary {\n                            asText: [Function (anonymous)],\n                            asHex: [Function (anonymous)],\n                            asOpaqueHex: [Function (anonymous)],\n                            asBytes: [Function (anonymous)],\n                            asOpaqueBytes: [Function (anonymous)]\n                          }\n                        }\n                      }\n                    }\n                  }\n                }\n              },\n              {\n                type: 'SetTopic',\n                value: FixedSizeBinary {\n                  asText: [Function (anonymous)],\n                  asHex: [Function (anonymous)],\n                  asOpaqueHex: [Function (anonymous)],\n                  asBytes: [Function (anonymous)],\n                  asOpaqueBytes: [Function (anonymous)]\n                }\n              }\n            ]\n          }\n        }      \n      
\n
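    A dry run like this is most useful as a pre-flight check before submitting the real extrinsic. The following minimal sketch assumes it is placed before the final `client.destroy()` call, that `executionResult` and `tx` from the script above are still in scope, and that a signer would be provided separately.

    ```js
    // Only proceed when the simulated dispatch succeeded
    if (executionResult.success) {
      console.log('Dry run succeeded; the transfer is safe to sign and submit.');
      // e.g. await tx.signAndSubmit(signer); // requires a signer, not shown here
    } else {
      // On failure, the value carries the dispatch error details
      console.error('Dry run failed:', executionResult.value);
    }
    ```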
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 3, "depth": 3, "title": "Dry Run XCM", "anchor": "dry-run-xcm", "start_char": 10303, "end_char": 16591, "estimated_token_count": 1150, "token_estimator": "heuristic-v1", "text": "### Dry Run XCM\n\nThis API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains.\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `origin_location` ++\"VersionedLocation\"++ ++\"required\"++\n\n The location of the origin that will execute the xcm message.\n\n ---\n\n `xcm` ++\"VersionedXcm\"++ ++\"required\"++\n\n A versioned XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `XcmDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a [teleport asset transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain. The code shows how to test and verify the received XCM message's behavior in the destination chain through a dry run on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin\n const origin = XcmVersionedLocation.V3({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n });\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: 
XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute dry run xcm\n const dryRunResult = await paseoAssetHubApi.apis.DryRunApi.dry_run_xcm(\n origin,\n xcm,\n );\n\n // Print the results\n console.dir(dryRunResult.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            type: 'Complete',\n            value: { used: { ref_time: 15574200000n, proof_size: 359300n } }\n          },\n          emitted_events: [\n            {\n              type: 'System',\n              value: {\n                type: 'NewAccount',\n                value: { account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET' }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Endowed',\n                value: {\n                  account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  free_balance: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Minted',\n                value: {\n                  who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  amount: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: { type: 'Issued', value: { amount: 1796500000n } }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Deposit',\n                value: {\n                  who: '13UVJyLgBASGhE2ok3TvxUfaQBGUt88JCcdYjHvUhvQkFTTx',\n                  amount: 1796500000n\n                }\n              }\n            }\n          ],\n          forwarded_xcms: [\n            [\n              {\n                type: 'V4',\n                value: { parents: 1, interior: { type: 'Here', value: undefined } }\n              },\n              []\n            ]\n          ]\n        }\n      
\n
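    To act on the simulated effects programmatically, the sketch below (an illustrative addition that assumes `dryRunResult` is still in scope and runs before `client.destroy()`) checks for the `Complete` outcome shown above and lists any XCMs the destination chain would forward onwards.

    ```js
    const effects = dryRunResult.value;
    if (effects.execution_result.type === 'Complete') {
      // Weight consumed by executing the XCM against the current chain state
      console.log('XCM executed; weight used:', effects.execution_result.value.used);
      // Each entry pairs a destination location with the XCMs queued for it
      for (const [location, messages] of effects.forwarded_xcms) {
        console.log(`${messages.length} message(s) queued for`, location.value);
      }
    } else {
      console.error('XCM did not complete:', effects.execution_result);
    }
    ```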
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 4, "depth": 2, "title": "XCM Payment API", "anchor": "xcm-payment-api", "start_char": 16591, "end_char": 17677, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## XCM Payment API\n\nThe [XCM Payment API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/fees/trait.XcmPaymentApi.html){target=\\_blank} provides a standardized way to determine the costs and payment options for executing XCM messages. Specifically, it enables clients to:\n\n- Retrieve the [weight](/polkadot-protocol/glossary/#weight) required to execute an XCM message.\n- Obtain a list of acceptable `AssetIds` for paying execution fees.\n- Calculate the cost of the weight in a specified `AssetId`.\n- Estimate the fees for XCM message delivery.\n\nThis API eliminates the need for clients to guess execution fees or identify acceptable assets manually. Instead, clients can query the list of supported asset IDs formatted according to the XCM version they understand. With this information, they can weigh the XCM program they intend to execute and convert the computed weight into its cost using one of the acceptable assets.\n\nTo use the API effectively, the client must already know the XCM program to be executed and the chains involved in the program's execution."} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 5, "depth": 3, "title": "Query Acceptable Payment Assets", "anchor": "query-acceptable-payment-assets", "start_char": 17677, "end_char": 20226, "estimated_token_count": 563, "token_estimator": "heuristic-v1", "text": "### Query Acceptable Payment Assets\n\nRetrieves the list of assets that are acceptable for paying fees when using a specific XCM version\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `xcm_version` ++\"Version\"++ ++\"required\"++\n\n Specifies the XCM version that will be used to send the XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n A list of acceptable payment assets. Each asset is provided in a versioned format (`VersionedAssetId`) that matches the specified XCM version. If an error occurs, it is returned instead of the asset list.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the acceptable payment assets for executing XCM messages on the Paseo Asset Hub network using XCM version 3.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the xcm version to use\n const xcmVersion = 3;\n\n // Execute the runtime call to query the assets\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_acceptable_payment_assets(\n xcmVersion,\n );\n\n // Print the assets\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        [\n          {\n            type: 'V3',\n            value: {\n              type: 'Concrete',\n              value: { parents: 1, interior: { type: 'Here', value: undefined } }\n            }\n          }\n        ]\n      
\n
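    As a rough illustration (assuming `result` from the script above is still in scope), each returned entry is a versioned asset ID that can be unwrapped to inspect the concrete asset location:

    ```js
    for (const assetId of result.value) {
      if (assetId.value.type === 'Concrete') {
        // For Paseo Asset Hub this is the relay chain's native asset (parents: 1, Here)
        console.log(`${assetId.type} fee asset at:`, assetId.value.value);
      }
    }
    ```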
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 6, "depth": 3, "title": "Query XCM Weight", "anchor": "query-xcm-weight", "start_char": 20226, "end_char": 24827, "estimated_token_count": 922, "token_estimator": "heuristic-v1", "text": "### Query XCM Weight\n\nCalculates the weight required to execute a given XCM message. It is useful for estimating the execution cost of a cross-chain message in the destination chain before sending it.\n\n```rust\nfn query_xcm_weight(message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n A versioned XCM message whose execution weight is being queried.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated weight required to execute the provided XCM message. If the calculation fails, an error is returned instead.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the weight needed to execute a [teleport transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain using the XCM Payment API. 
The result shows the required weight in terms of reference time and proof size needed in the destination chain.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query weight runtime call\n const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n { ref_time: 15574200000n, proof_size: 359300n }\n
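    The returned weight can then be fed into the [Query Weight to Asset Fee](#query-weight-to-asset-fee) call to price the execution. A minimal sketch, assuming `paseoAssetHubApi` and `result` from the script above are still in scope and that this runs before `client.destroy()`:

    ```js
    const weight = result.value; // { ref_time, proof_size }

    // Price the weight in the relay chain's native asset (PAS), expressed as a V4 asset id
    const feeResult =
      await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(weight, {
        type: 'V4',
        value: { parents: 1, interior: { type: 'Here', value: undefined } },
      });

    console.log('Estimated execution fee (planck):', feeResult.value);
    ```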
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 7, "depth": 3, "title": "Query Weight to Asset Fee", "anchor": "query-weight-to-asset-fee", "start_char": 24827, "end_char": 27869, "estimated_token_count": 699, "token_estimator": "heuristic-v1", "text": "### Query Weight to Asset Fee\n\nConverts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset.\n\n```rust\nfn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `weight` ++\"Weight\"++ ++\"required\"++\n \n The execution weight to be converted into a fee.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ---\n\n `asset` ++\"VersionedAssetId\"++ ++\"required\"++\n \n The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The fee needed to pay for the execution for the given `AssetId.`\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the weight to convert to fee\n const weight = { ref_time: 15574200000n, proof_size: 359300n };\n\n // Define the versioned asset id\n const versionedAssetId = {\n type: 'V4',\n value: { parents: 1, interior: { type: 'Here', value: undefined } },\n };\n\n // Execute the runtime call to convert the weight to fee\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(\n weight,\n versionedAssetId,\n );\n\n // Print the fee\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n 1796500000n\n
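    The fee is returned in the asset's smallest unit (planck). As a back-of-the-envelope conversion, assuming `result` is still in scope and that PAS uses 10 decimals (worth confirming against the chain metadata):

    ```js
    const PLANCK_PER_PAS = 10_000_000_000n; // assumes 10 decimals for PAS
    const feePlanck = result.value; // e.g. 1796500000n
    console.log(`Fee: ${Number(feePlanck) / Number(PLANCK_PER_PAS)} PAS`); // ~0.17965 PAS
    ```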
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 8, "depth": 3, "title": "Query Delivery Fees", "anchor": "query-delivery-fees", "start_char": 27869, "end_char": 32843, "estimated_token_count": 965, "token_estimator": "heuristic-v1", "text": "### Query Delivery Fees\n\nRetrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain.\n\n```rust\nfn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `destination` ++\"VersionedLocation\"++ ++\"required\"++\n \n The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms.\n\n ---\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseo,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const paseoAssetHubParaID = 1000;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the destination\n const destination = XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)),\n });\n\n // Define the xcm message that will be sent to the destination\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query delivery fees runtime call\n const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees(\n destination,\n xcm,\n );\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          type: 'V3',\n          value: [\n            {\n              id: {\n                type: 'Concrete',\n                value: { parents: 0, interior: { type: 'Here', value: undefined } }\n              },\n              fun: { type: 'Fungible', value: 396000000n }\n            }\n          ]\n        }\n      
\n
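    To use the quoted fee in code, the amount can be pulled out of the versioned assets structure shown above. A minimal sketch, assuming `result` from the script above is still in scope:

    ```js
    const feeAssets = result.value; // { type: 'V3', value: [...] } as printed above
    const delivery = feeAssets.value.find((asset) => asset.fun.type === 'Fungible');
    console.log('Delivery fee (planck):', delivery?.fun.value); // e.g. 396000000n
    ```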
\n\n ---"} {"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 872, "end_char": 922, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 922, "end_char": 2375, "estimated_token_count": 390, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 922, "end_char": 2331, "estimated_token_count": 378, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-networks", "page_title": "Networks", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 12, "end_char": 513, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nThe Polkadot ecosystem consists of multiple networks designed to support different stages of blockchain development, from main networks to test networks. Each network serves a unique purpose, providing developers with flexible environments for building, testing, and deploying blockchain applications.\n\nThis section includes essential network information such as RPC endpoints, currency symbols and decimals, and how to acquire TestNet tokens for the Polkadot ecosystem of networks."} {"page_id": "develop-networks", "page_title": "Networks", "index": 1, "depth": 2, "title": "Production Networks", "anchor": "production-networks", "start_char": 513, "end_char": 537, "estimated_token_count": 4, "token_estimator": "heuristic-v1", "text": "## Production Networks"} {"page_id": "develop-networks", "page_title": "Networks", "index": 2, "depth": 3, "title": "Polkadot", "anchor": "polkadot", "start_char": 537, "end_char": 1992, "estimated_token_count": 382, "token_estimator": "heuristic-v1", "text": "### Polkadot\n\nPolkadot is the primary production blockchain network for high-stakes, enterprise-grade applications. 
Polkadot MainNet has been running since May 2020 and has implementations in various programming languages ranging from Rust to JavaScript.\n\n=== \"Network Details\"\n\n **Currency symbol**: `DOT`\n\n ---\n \n **Currency decimals**: 10\n\n ---\n\n **Block explorer**: [Polkadot Subscan](https://polkadot.subscan.io/){target=\\_blank}\n\n=== \"RPC Endpoints\"\n\n Blockops\n\n ```\n wss://polkadot-public-rpc.blockops.network/ws\n ```\n\n ---\n\n Dwellir\n\n ```\n wss://polkadot-rpc.dwellir.com\n ```\n\n ---\n\n Dwellir Tunisia\n\n ```\n wss://polkadot-rpc-tn.dwellir.com\n ```\n\n ---\n\n IBP1\n\n ```\n wss://rpc.ibp.network/polkadot\n ```\n\n ---\n\n IBP2\n\n ```\n wss://polkadot.dotters.network\n ```\n\n ---\n\n LuckyFriday\n\n ```\n wss://rpc-polkadot.luckyfriday.io\n ```\n\n ---\n\n OnFinality\n\n ```\n wss://polkadot.api.onfinality.io/public-ws\n ```\n\n ---\n\n RadiumBlock\n\n ```\n wss://polkadot.public.curie.radiumblock.co/ws\n ```\n\n ---\n\n RockX\n\n ```\n wss://rockx-dot.w3node.com/polka-public-dot/ws\n ```\n\n ---\n\n Stakeworld\n\n ```\n wss://dot-rpc.stakeworld.io\n ```\n\n ---\n\n SubQuery\n\n ```\n wss://polkadot.rpc.subquery.network/public/ws\n ```\n\n ---\n\n Light client\n\n ```\n light://substrate-connect/polkadot\n ```"} @@ -162,16 +162,16 @@ {"page_id": "develop-parachains-customize-parachain-overview", "page_title": "Overview of FRAME", "index": 7, "depth": 3, "title": "Parachain Templates", "anchor": "parachain-templates", "start_char": 7588, "end_char": 9140, "estimated_token_count": 338, "token_estimator": "heuristic-v1", "text": "### Parachain Templates\n\nParachain templates are specifically designed for chains that will connect to and interact with relay chains in the Polkadot ecosystem:\n\n- **[`parachain-template`](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\\_blank}**: Designed for connecting to relay chains like Polkadot, Kusama, or Paseo, this template enables a chain to operate as a parachain. For projects aiming to integrate with Polkadot’s ecosystem, this template offers a great starting point.\n\n- **[`OpenZeppelin`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main){target=\\_blank}**: Offers two flexible starting points.\n - The [`generic-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/generic-template){target=\\_blank} provides a minimal setup with essential pallets and secure defaults, creating a reliable foundation for custom blockchain development.\n - The [`evm-runtime-template`](https://github.com/OpenZeppelin/polkadot-runtime-templates/tree/main/evm-template){target=\\_blank} enables EVM compatibility, allowing developers to migrate Solidity contracts and EVM-based dApps. This template is ideal for Ethereum developers looking to leverage Substrate's capabilities.\n\nChoosing a suitable template depends on your project’s unique requirements, level of customization, and integration needs. 
Starting from a template speeds up development and lets you focus on implementing your chain’s unique features rather than the foundational blockchain setup."} {"page_id": "develop-parachains-customize-parachain-overview", "page_title": "Overview of FRAME", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 9140, "end_char": 9439, "estimated_token_count": 71, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\nFor more detailed information on implementing this process, refer to the following sections:\n\n- [Add a Pallet to Your Runtime](/develop/parachains/customize-parachain/add-existing-pallets/)\n- [Create a Custom Pallet](/develop/parachains/customize-parachain/make-custom-pallet/)"} {"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 671, "end_char": 721, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 721, "end_char": 1899, "estimated_token_count": 323, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-customize-parachain", "page_title": "Customize Your Parachain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 721, "end_char": 1866, "estimated_token_count": 314, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 33, "end_char": 1201, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBy default, the Rust compiler produces optimized Wasm binaries. These binaries are suitable for working in an isolated environment, such as local development. However, the Wasm binaries the compiler builds by default aren't guaranteed to be deterministically reproducible. Each time the compiler generates the Wasm runtime, it might produce a slightly different Wasm byte code. This is problematic in a blockchain network where all nodes must use exactly the same raw chain specification file.\n\nWorking with builds that aren't guaranteed to be deterministically reproducible can cause other problems, too. For example, for automating the build processes for a blockchain, it is ideal that the same code always produces the same result (in terms of bytecode). Compiling the Wasm runtime with every push would produce inconsistent and unpredictable results without a deterministic build, making it difficult to integrate with any automation and likely to break a CI/CD pipeline continuously. 
Deterministic builds—code that always compiles to exactly the same bytecode—ensure that the Wasm runtime can be inspected, audited, and independently verified."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1201, "end_char": 1327, "estimated_token_count": 37, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, ensure you have [Docker](https://www.docker.com/get-started/){target=\\_blank} installed."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 2, "depth": 2, "title": "Tooling for Wasm Runtime", "anchor": "tooling-for-wasm-runtime", "start_char": 1327, "end_char": 2108, "estimated_token_count": 173, "token_estimator": "heuristic-v1", "text": "## Tooling for Wasm Runtime\n\nTo compile the Wasm runtime deterministically, the same tooling that produces the runtime for Polkadot, Kusama, and other Polkadot SDK-based chains can be used. This tooling, referred to collectively as the Substrate Runtime Toolbox or [`srtool`](https://github.com/paritytech/srtool){target=\\_blank}, ensures that the same source code consistently compiles to an identical Wasm blob.\n\nThe core component of `srtool` is a Docker container executed as part of a Docker image. The name of the `srtool` Docker image specifies the version of the Rust compiler used to compile the code included in the image. For example, the image `paritytech/srtool:1.88.0` indicates that the code in the image was compiled with version `1.88.0` of the `rustc` compiler."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 3, "depth": 2, "title": "Working with the Docker Container", "anchor": "working-with-the-docker-container", "start_char": 2108, "end_char": 3426, "estimated_token_count": 337, "token_estimator": "heuristic-v1", "text": "## Working with the Docker Container\n\nThe [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} package is a command-line utility written in Rust that installs an executable program called `srtool`. This program simplifies the interactions with the `srtool` Docker container.\n\nOver time, the tooling around the `srtool` Docker image has expanded to include the following tools and helper programs:\n\n- **[`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank}**: Provides a command-line interface to pull the srtool Docker image, get information about the image and tooling used to interact with it, and build the runtime using the `srtool` Docker container.\n- **[`subwasm`](https://github.com/chevdor/subwasm){target=\\_blank}**: Provides command-line options for working with the metadata and Wasm runtime built using srtool. 
The `subwasm` program is also used internally to perform tasks in the `srtool` image.\n- **[`srtool-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank}**: Provides GitHub actions to integrate builds produced using the `srtool` image with your GitHub CI/CD pipelines.\n- **[`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank}**: Provides a simple graphical user interface for building the runtime using the `srtool` Docker image."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 4, "depth": 2, "title": "Prepare the Environment", "anchor": "prepare-the-environment", "start_char": 3426, "end_char": 4413, "estimated_token_count": 235, "token_estimator": "heuristic-v1", "text": "## Prepare the Environment\n\nIt is recommended to install the `srtool-cli` program to work with the Docker image using a simple command-line interface.\n\nTo prepare the environment:\n\n1. Verify that Docker is installed by running the following command:\n\n ```bash\n docker --version\n ```\n\n If Docker is installed, the command will display version information:\n\n
\n docker --version\n Docker version 20.10.17, build 100c701\n
\n\n2. Install the `srtool` command-line interface by running the following command:\n\n ```bash\n cargo install --git https://github.com/chevdor/srtool-cli\n ```\n\n3. View usage information for the `srtool` command-line interface by running the following command:\n\n ```bash\n srtool help\n ```\n\n4. Download the latest `srtool` Docker image by running the following command:\n\n ```bash\n srtool pull\n ```"} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 5, "depth": 2, "title": "Start a Deterministic Build", "anchor": "start-a-deterministic-build", "start_char": 4413, "end_char": 5316, "estimated_token_count": 212, "token_estimator": "heuristic-v1", "text": "## Start a Deterministic Build\n\nAfter preparing the environment, the Wasm runtime can be compiled using the `srtool` Docker image.\n\nTo build the runtime, you need to open your Polkadot SDK-based project in a terminal shell and run the following command:\n\n```bash\nsrtool build --app --package INSERT_RUNTIME_PACKAGE_NAME --runtime-dir INSERT_RUNTIME_PATH \n```\n\n- The name specified for the `--package` should be the name defined in the `Cargo.toml` file for the runtime.\n- The path specified for the `--runtime-dir` should be the path to the `Cargo.toml` file for the runtime. For example:\n\n ```plain\n node/\n pallets/\n runtime/\n ├──lib.rs\n └──Cargo.toml # INSERT_RUNTIME_PATH should be the path to this file\n ...\n ```\n\n- If the `Cargo.toml` file for the runtime is located in a `runtime` subdirectory, for example, `runtime/kusama`, the `--runtime-dir` parameter can be omitted."} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 6, "depth": 2, "title": "Use srtool in GitHub Actions", "anchor": "use-srtool-in-github-actions", "start_char": 5316, "end_char": 6984, "estimated_token_count": 376, "token_estimator": "heuristic-v1", "text": "## Use srtool in GitHub Actions\n\nTo add a GitHub workflow for building the runtime:\n\n1. Create a `.github/workflows` directory in the chain's directory.\n2. In the `.github/workflows` directory, click **Add file**, then select **Create new file**.\n3. Copy the sample GitHub action from `basic.yml` example in the [`srtools-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank} repository and paste it into the file you created in the previous step.\n\n ??? interface \"`basic.yml`\"\n\n {% raw %}\n ```yml\n name: Srtool build\n\n on: push\n\n jobs:\n srtool:\n runs-on: ubuntu-latest\n strategy:\n matrix:\n chain: [\"asset-hub-kusama\", \"asset-hub-westend\"]\n steps:\n - uses: actions/checkout@v3\n - name: Srtool build\n id: srtool_build\n uses: chevdor/srtool-actions@v0.8.0\n with:\n chain: ${{ matrix.chain }}\n runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime\n - name: Summary\n run: |\n echo '${{ steps.srtool_build.outputs.json }}' | jq . > ${{ matrix.chain }}-srtool-digest.json\n cat ${{ matrix.chain }}-srtool-digest.json\n echo \"Runtime location: ${{ steps.srtool_build.outputs.wasm }}\"\n ```\n {% endraw %}\n\n4. Modify the settings in the sample action.\n\n For example, modify the following settings:\n\n - The name of the chain.\n - The name of the runtime package.\n - The location of the runtime.\n\n5. 
Type a name for the action file and commit."} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 7, "depth": 2, "title": "Use the srtool Image via Docker Hub", "anchor": "use-the-srtool-image-via-docker-hub", "start_char": 6984, "end_char": 7774, "estimated_token_count": 215, "token_estimator": "heuristic-v1", "text": "## Use the srtool Image via Docker Hub\n\nIf utilizing [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} or [`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank} isn't an option, the `paritytech/srtool` container image can be used directly via Docker Hub.\n\nTo pull the image from Docker Hub:\n\n1. Sign in to Docker Hub.\n2. Type `paritytech/srtool` in the search field and press enter.\n3. Click **paritytech/srtool**, then click **Tags**.\n4. Copy the command for the image you want to pull.\n5. Open a terminal shell on your local computer.\n6. Paste the command you copied from the Docker Hub. For example, you might run a command similar to the following, which downloads and unpacks the image:\n\n ```bash\n docker pull paritytech/srtool:1.88.0\n ```"} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 8, "depth": 3, "title": "Naming Convention for Images", "anchor": "naming-convention-for-images", "start_char": 7774, "end_char": 8470, "estimated_token_count": 156, "token_estimator": "heuristic-v1", "text": "### Naming Convention for Images\n\nKeep in mind that there is no `latest` tag for the `srtool` image. Ensure that the image selected is compatible with the locally available version of the Rust compiler.\n\nThe naming convention for `paritytech/srtool` Docker images specifies the version of the Rust compiler used to compile the code included in the image. Some images specify both a compiler version and the version of the build script used. For example, an image named `paritytech/srtool:1.62.0-0.9.19` was compiled with version `1.62.0` of the `rustc` compiler and version `0.9.19` of the build script. Images that only specify the compiler version always contain the software's latest version."} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 6, "depth": 2, "title": "Use srtool in GitHub Actions", "anchor": "use-srtool-in-github-actions", "start_char": 5316, "end_char": 6136, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "## Use srtool in GitHub Actions\n\nTo add a GitHub workflow for building the runtime:\n\n1. Create a `.github/workflows` directory in the chain's directory.\n2. In the `.github/workflows` directory, click **Add file**, then select **Create new file**.\n3. Copy the sample GitHub action from `basic.yml` example in the [`srtools-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank} repository and paste it into the file you created in the previous step.\n\n ??? interface \"`basic.yml`\"\n\n {% raw %}\n ```yml\n \n ```\n {% endraw %}\n\n4. Modify the settings in the sample action.\n\n For example, modify the following settings:\n\n - The name of the chain.\n - The name of the runtime package.\n - The location of the runtime.\n\n5. 
Type a name for the action file and commit."} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 7, "depth": 2, "title": "Use the srtool Image via Docker Hub", "anchor": "use-the-srtool-image-via-docker-hub", "start_char": 6136, "end_char": 6926, "estimated_token_count": 215, "token_estimator": "heuristic-v1", "text": "## Use the srtool Image via Docker Hub\n\nIf utilizing [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} or [`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank} isn't an option, the `paritytech/srtool` container image can be used directly via Docker Hub.\n\nTo pull the image from Docker Hub:\n\n1. Sign in to Docker Hub.\n2. Type `paritytech/srtool` in the search field and press enter.\n3. Click **paritytech/srtool**, then click **Tags**.\n4. Copy the command for the image you want to pull.\n5. Open a terminal shell on your local computer.\n6. Paste the command you copied from the Docker Hub. For example, you might run a command similar to the following, which downloads and unpacks the image:\n\n ```bash\n docker pull paritytech/srtool:1.88.0\n ```"} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 8, "depth": 3, "title": "Naming Convention for Images", "anchor": "naming-convention-for-images", "start_char": 6926, "end_char": 7622, "estimated_token_count": 156, "token_estimator": "heuristic-v1", "text": "### Naming Convention for Images\n\nKeep in mind that there is no `latest` tag for the `srtool` image. Ensure that the image selected is compatible with the locally available version of the Rust compiler.\n\nThe naming convention for `paritytech/srtool` Docker images specifies the version of the Rust compiler used to compile the code included in the image. Some images specify both a compiler version and the version of the build script used. For example, an image named `paritytech/srtool:1.62.0-0.9.19` was compiled with version `1.62.0` of the `rustc` compiler and version `0.9.19` of the build script. Images that only specify the compiler version always contain the software's latest version."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 20, "end_char": 454, "estimated_token_count": 75, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCoretime can be purchased in bulk for a period of 28 days, providing access to Polkadot's shared security and interoperability for Polkadot parachains. The bulk purchase of coretime includes a rent-control mechanism that keeps future purchases within a predictable price range of the initial purchase. This allows cores to be renewed at a known price without competing against other participants in the open market."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 1, "depth": 2, "title": "Bulk Sale Phases", "anchor": "bulk-sale-phases", "start_char": 454, "end_char": 1474, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Bulk Sale Phases\n\nThe bulk sale process consists of three distinct phases:\n\n1. **Interlude phase**: The period between bulk sales when renewals are prioritized.\n2. 
**Lead-in phase**: Following the interlude phase, a new `start_price` is set, and a Dutch auction begins, lasting for `leadin_length` blocks. During this phase, prices experience downward pressure as the system aims to find market equilibrium. The final price at the end of this phase becomes the `regular_price`, which will be used in the subsequent fixed price phase.\n3. **Fixed price phase**: The final phase where remaining cores are sold at the `regular_price` established during the lead-in phase. This provides a stable and predictable pricing environment for participants who did not purchase during the price discovery period.\n\nFor more comprehensive information about the coretime sales process, refer to the [Coretime Sales](https://wiki.polkadot.com/learn/learn-agile-coretime/#coretime-sales){target=\\_blank} section in the Polkadot Wiki."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 2, "depth": 2, "title": "Renewal Timing", "anchor": "renewal-timing", "start_char": 1474, "end_char": 2107, "estimated_token_count": 118, "token_estimator": "heuristic-v1", "text": "## Renewal Timing\n\nWhile renewals can technically be made during any phase, it is strongly recommended that they be completed during the interlude phase. Delaying renewal introduces the risk that the core could be sold to another market participant, preventing successful renewal. Renewals must be initiated well in advance to avoid the scenario above. \n\nFor example, if you purchase a core in bulk sale #1, you obtain coretime for the upcoming bulk period (during which bulk sale #2 takes place).\nYour renewal must be completed during bulk sale #2, ideally during its interlude phase, to secure coretime for the subsequent period."} @@ -208,7 +208,7 @@ {"page_id": "develop-parachains-deployment-obtain-coretime", "page_title": "Obtain Coretime", "index": 5, "depth": 3, "title": "On-demand Coretime", "anchor": "on-demand-coretime", "start_char": 3260, "end_char": 3713, "estimated_token_count": 97, "token_estimator": "heuristic-v1", "text": "### On-demand Coretime\n\nOn-demand coretime allows for flexible, as-needed block production. To purchase:\n\n1. Ensure your collator node is fully synchronized with the relay chain.\n2. Submit the `onDemand.placeOrderAllowDeath` extrinsic on the relay chain with:\n\n - **`maxAmountFor`**: Sufficient funds for the transaction.\n - **`paraId`**: Your registered `ParaID`.\n\nAfter successfully executing the extrinsic, your parachain will produce a block."} {"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 0, "depth": 2, "title": "Deployment Process", "anchor": "deployment-process", "start_char": 371, "end_char": 4371, "estimated_token_count": 837, "token_estimator": "heuristic-v1", "text": "## Deployment Process\n\nTaking your Polkadot SDK-based blockchain from a local environment to production involves several steps, ensuring your network is stable, secure, and ready for real-world use. The following diagram outlines the process at a high level:\n\n```mermaid\nflowchart TD\n %% Group 1: Pre-Deployment\n subgraph group1 [Pre-Deployment]\n direction LR\n A(\"Local
Development
and Testing\") --> B(\"Runtime
Compilation\")\n B --> C(\"Generate
Chain
Specifications\")\n C --> D(\"Prepare
Deployment
Environment\")\n D --> E(\"Acquire
Coretime\")\n end\n \n %% Group 2: Deployment\n subgraph group2 [Deployment]\n F(\"Launch
and
Monitor\")\n end\n\n %% Group 3: Post-Deployment\n subgraph group3 [Post-Deployment]\n G(\"Maintenance
and
Upgrades\")\n end\n\n %% Connections Between Groups\n group1 --> group2\n group2 --> group3\n\n %% Styling\n style group1 stroke:#6e7391,stroke-width:1px\n style group2 stroke:#6e7391,stroke-width:1px\n style group3 stroke:#6e7391,stroke-width:1px\n```\n\n- **Local development and testing**: The process begins with local development and testing. Developers focus on building the runtime by selecting and configuring the necessary pallets while refining network features. In this phase, running a local TestNet is essential to verify transactions and ensure the blockchain behaves as expected. Unit and integration tests ensure the network works as expected before launch. Thorough testing is conducted, not only for individual components but also for interactions between pallets.\n\n- **Runtime compilation**: Polkadot SDK-based blockchains are built with Wasm, a highly portable and efficient format. Compiling your blockchain's runtime into Wasm ensures it can be executed reliably across various environments, guaranteeing network-wide compatibility and security. The [srtool](https://github.com/paritytech/srtool){target=\\_blank} is helpful for this purpose since it allows you to compile [deterministic runtimes](/develop/parachains/deployment/build-deterministic-runtime/){target=\\_blank}.\n\n- **Generate chain specifications**: The chain spec file defines the structure and configuration of your blockchain. It includes initial node identities, session keys, and other parameters. Defining a well-thought-out chain specification ensures that your network will operate smoothly and according to your intended design.\n\n- **Deployment environment**: Whether launching a local test network or a production-grade blockchain, selecting the proper infrastructure is vital. For further information about these topics, see the [Infrastructure](/infrastructure/){target=\\_blank} section.\n\n- **Acquire coretime**: To build on top of the Polkadot network, users need to acquire coretime (either on-demand or in bulk) to access the computational resources of the relay chain. This allows for the secure validation of parachain blocks through a randomized selection of relay chain validators.\n\n If you’re building a standalone blockchain (solochain) that won’t connect to Polkadot as a parachain, you can skip the preceding step, as there’s no need to acquire coretime or implement [Cumulus](/develop/parachains/#cumulus){target=\\_blank}.\n\n- **Launch and monitor**: Once everything is configured, you can launch the blockchain, initiating the network with your chain spec and Wasm runtime. Validators or collators will begin producing blocks, and the network will go live. Post-launch, monitoring is vital to ensuring network health—tracking block production, node performance, and overall security.\n\n- **Maintenance and upgrade**: A blockchain continues to evolve post-deployment. As the network expands and adapts, it may require runtime upgrades, governance updates, coretime renewals, and even modifications to the underlying code. 
For an in-depth guide on this topic, see the [Maintenance](/develop/parachains/maintenance/){target=\\_blank} section."} {"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 4371, "end_char": 4421, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 4421, "end_char": 4806, "estimated_token_count": 111, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-deployment", "page_title": "Deployment", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 4421, "end_char": 4795, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 0, "depth": 2, "title": "macOS", "anchor": "macos", "start_char": 324, "end_char": 463, "estimated_token_count": 25, "token_estimator": "heuristic-v1", "text": "## macOS\n\nYou can install Rust and set up a Substrate development environment on Apple macOS computers with Intel or Apple M1 processors."} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 1, "depth": 3, "title": "Before You Begin", "anchor": "before-you-begin", "start_char": 463, "end_char": 1915, "estimated_token_count": 334, "token_estimator": "heuristic-v1", "text": "### Before You Begin\n\nBefore you install Rust and set up your development environment on macOS, verify that your computer meets the following basic requirements:\n\n- Operating system version is 10.7 Lion or later.\n- Processor speed of at least 2 GHz. Note that 3 GHz is recommended.\n- Memory of at least 8 GB RAM. Note that 16 GB is recommended.\n- Storage of at least 10 GB of available space.\n- Broadband Internet connection.\n\n#### Install Homebrew\n\nIn most cases, you should use Homebrew to install and manage packages on macOS computers. If you don't already have Homebrew installed on your local computer, you should download and install it before continuing.\n\nTo install Homebrew:\n\n1. Open the Terminal application.\n2. Download and install Homebrew by running the following command:\n\n ```bash\n /bin/bash -c \"$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)\"\n ```\n\n3. Verify Homebrew has been successfully installed by running the following command:\n\n ```bash\n brew --version\n ```\n\n The command displays output similar to the following:\n\n
\n brew --version\n Homebrew 4.3.15\n
\n\n#### Support for Apple Silicon\n\nProtobuf must be installed before the build process can begin. To install it, run the following command:\n\n```bash\nbrew install protobuf\n```"} {"page_id": "develop-parachains-install-polkadot-sdk", "page_title": "Install Polkadot SDK Dependencies", "index": 2, "depth": 3, "title": "Install Required Packages and Rust", "anchor": "install-required-packages-and-rust", "start_char": 1915, "end_char": 3306, "estimated_token_count": 291, "token_estimator": "heuristic-v1", "text": "### Install Required Packages and Rust\n\nBecause the blockchain requires standard cryptography to support the generation of public/private key pairs and the validation of transaction signatures, you must also have a package that provides cryptography, such as `openssl`.\n\nTo install `openssl` and the Rust toolchain on macOS:\n\n1. Open the Terminal application.\n2. Ensure you have an updated version of Homebrew by running the following command:\n\n ```bash\n brew update\n ```\n\n3. Install the `openssl` package by running the following command:\n\n ```bash\n brew install openssl\n ```\n\n4. Download the `rustup` installation program and use it to install Rust by running the following command:\n\n ```bash\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh\n ```\n\n5. Follow the prompts displayed to proceed with a default installation.\n6. Update your current shell to include Cargo by running the following command:\n\n ```bash\n source ~/.cargo/env\n ```\n\n7. Configure the Rust toolchain to default to the latest stable version by running the following commands:\n\n ```bash\n rustup default stable\n rustup update\n rustup target add wasm32-unknown-unknown\n rustup component add rust-src\n ```\n\n8. [Verify your installation](#verifying-installation).\n9. Install `cmake` using the following command:\n\n ```bash\n brew install cmake\n ```"} @@ -252,11 +252,11 @@ {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 2, "depth": 2, "title": "Implement Storage Migrations", "anchor": "implement-storage-migrations", "start_char": 4349, "end_char": 4975, "estimated_token_count": 155, "token_estimator": "heuristic-v1", "text": "## Implement Storage Migrations\n\nThe [`OnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.OnRuntimeUpgrade.html){target=\\_blank} trait provides the foundation for implementing storage migrations in your runtime. Here's a detailed look at its essential functions:\n\n```rust\npub trait OnRuntimeUpgrade {\n fn on_runtime_upgrade() -> Weight { ... }\n fn try_on_runtime_upgrade(checks: bool) -> Result { ... }\n fn pre_upgrade() -> Result, TryRuntimeError> { ... }\n fn post_upgrade(_state: Vec) -> Result<(), TryRuntimeError> { ... }\n}\n```"} {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 3, "depth": 3, "title": "Core Migration Function", "anchor": "core-migration-function", "start_char": 4975, "end_char": 6007, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "### Core Migration Function\n\nThe [`on_runtime_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_runtime_upgrade){target=\\_blank} function executes when the FRAME Executive pallet detects a runtime upgrade. 
Important considerations when using this function include:\n\n- It runs before any pallet's `on_initialize` hooks.\n- Critical storage items (like [`block_number`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.block_number){target=\\_blank}) may not be set.\n- Execution is mandatory and must be completed.\n- Careful weight calculation is required to prevent bricking the chain.\n\nWhen implementing the migration logic, your code must handle several vital responsibilities. A migration implementation must do the following to operate correctly:\n\n- Read existing storage values in their original format.\n- Transform data to match the new format.\n- Write updated values back to storage.\n- Calculate and return consumed weight."} {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 4, "depth": 3, "title": "Migration Testing Hooks", "anchor": "migration-testing-hooks", "start_char": 6007, "end_char": 8023, "estimated_token_count": 399, "token_estimator": "heuristic-v1", "text": "### Migration Testing Hooks\n\nThe `OnRuntimeUpgrade` trait provides some functions designed specifically for testing migrations. These functions never execute on-chain but are essential for validating migration behavior in test environments. The migration test hooks are as follows:\n\n- **[`try_on_runtime_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.OnRuntimeUpgrade.html#method.try_on_runtime_upgrade){target=\\_blank}**: This function serves as the primary orchestrator for testing the complete migration process. It coordinates the execution flow from `pre-upgrade` checks through the actual migration to `post-upgrade` verification. Handling the entire migration sequence ensures that storage modifications occur correctly and in the proper order. Preserving this sequence is particularly valuable when testing multiple dependent migrations, where the execution order matters.\n\n- **[`pre_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.pre_upgrade){target=\\_blank}**: Before a runtime upgrade begins, the `pre_upgrade` function performs preliminary checks and captures the current state. It returns encoded state data that can be used for `post-upgrade` verification. This function must never modify storage: it should only read and verify the existing state. The data it returns includes critical state values that should remain consistent or transform predictably during migration.\n\n- **[`post_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.post_upgrade){target=\\_blank}**: After the migration completes, `post_upgrade` validates its success. It receives the state data captured by `pre_upgrade` to verify that the migration was executed correctly. This function checks for storage consistency and ensures all data transformations are completed as expected. Like `pre_upgrade`, it operates exclusively in testing environments and should not modify storage."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 5, "depth": 3, "title": "Migration Structure", "anchor": "migration-structure", "start_char": 8023, "end_char": 14864, "estimated_token_count": 1637, "token_estimator": "heuristic-v1", "text": "### Migration Structure\n\nThere are two approaches to implementing storage migrations. 
The first method involves directly implementing `OnRuntimeUpgrade` on structs. This approach requires manually checking the on-chain storage version against the new [`StorageVersion`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/struct.StorageVersion.html){target=\\_blank} and executing the transformation logic only when the check passes. This version verification prevents multiple executions of the migration during subsequent runtime upgrades.\n\nThe recommended approach is to implement [`UncheckedOnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.UncheckedOnRuntimeUpgrade.html){target=\\_blank} and wrap it with [`VersionedMigration`](https://paritytech.github.io/polkadot-sdk/master/frame_support/migrations/struct.VersionedMigration.html){target=\\_blank}. `VersionedMigration` implements `OnRuntimeUpgrade` and handles storage version management automatically, following best practices and reducing potential errors.\n\n`VersionedMigration` requires five type parameters:\n\n- **`From`**: The source version for the upgrade.\n- **`To`**: The target version for the upgrade.\n- **`Inner`**: The `UncheckedOnRuntimeUpgrade` implementation.\n- **`Pallet`**: The pallet being upgraded.\n- **`Weight`**: The runtime's [`RuntimeDbWeight`](https://paritytech.github.io/polkadot-sdk/master/frame_support/weights/struct.RuntimeDbWeight.html){target=\\_blank} implementation.\n\nExamine the following migration example that transforms a simple `StorageValue` storing a `u32` into a more complex structure that tracks both current and previous values using the `CurrentAndPreviousValue` struct:\n\n- Old `StorageValue` format:\n\n ```rust\n #[pallet::storage]\n pub type Value = StorageValue<_, u32>;\n ```\n\n- New `StorageValue` format:\n\n ```rust\n /// Example struct holding the most recently set [`u32`] and the\n /// second most recently set [`u32`] (if one existed).\n #[docify::export]\n #[derive(\n \tClone, Eq, PartialEq, Encode, Decode, RuntimeDebug, scale_info::TypeInfo, MaxEncodedLen,\n )]\n pub struct CurrentAndPreviousValue {\n \t/// The most recently set value.\n \tpub current: u32,\n \t/// The previous value, if one existed.\n \tpub previous: Option,\n }\n #[pallet::storage]\n \tpub type Value = StorageValue<_, CurrentAndPreviousValue>;\n ```\n\n- Migration:\n\n ```rust\n use frame_support::{\n \tstorage_alias,\n \ttraits::{Get, UncheckedOnRuntimeUpgrade},\n };\n\n #[cfg(feature = \"try-runtime\")]\n use alloc::vec::Vec;\n\n /// Collection of storage item formats from the previous storage version.\n ///\n /// Required so we can read values in the v0 storage format during the migration.\n mod v0 {\n \tuse super::*;\n\n \t/// V0 type for [`crate::Value`].\n \t#[storage_alias]\n \tpub type Value = StorageValue, u32>;\n }\n\n /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1.\n ///\n /// In V0 of the template [`crate::Value`] is just a `u32`. 
In V1, it has been upgraded to\n /// contain the struct [`crate::CurrentAndPreviousValue`].\n ///\n /// In this migration, update the on-chain storage for the pallet to reflect the new storage\n /// layout.\n pub struct InnerMigrateV0ToV1(core::marker::PhantomData);\n\n impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 {\n \t/// Return the existing [`crate::Value`] so we can check that it was correctly set in\n \t/// `InnerMigrateV0ToV1::post_upgrade`.\n \t#[cfg(feature = \"try-runtime\")]\n \tfn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> {\n \t\tuse codec::Encode;\n\n \t\t// Access the old value using the `storage_alias` type\n \t\tlet old_value = v0::Value::::get();\n \t\t// Return it as an encoded `Vec`\n \t\tOk(old_value.encode())\n \t}\n\n \t/// Migrate the storage from V0 to V1.\n \t///\n \t/// - If the value doesn't exist, there is nothing to do.\n \t/// - If the value exists, it is read and then written back to storage inside a\n \t/// [`crate::CurrentAndPreviousValue`].\n \tfn on_runtime_upgrade() -> frame_support::weights::Weight {\n \t\t// Read the old value from storage\n \t\tif let Some(old_value) = v0::Value::::take() {\n \t\t\t// Write the new value to storage\n \t\t\tlet new = crate::CurrentAndPreviousValue { current: old_value, previous: None };\n \t\t\tcrate::Value::::put(new);\n \t\t\t// One read + write for taking the old value, and one write for setting the new value\n \t\t\tT::DbWeight::get().reads_writes(1, 2)\n \t\t} else {\n \t\t\t// No writes since there was no old value, just one read for checking\n \t\t\tT::DbWeight::get().reads(1)\n \t\t}\n \t}\n\n \t/// Verifies the storage was migrated correctly.\n \t///\n \t/// - If there was no old value, the new value should not be set.\n \t/// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`].\n \t#[cfg(feature = \"try-runtime\")]\n \tfn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> {\n \t\tuse codec::Decode;\n \t\tuse frame_support::ensure;\n\n \t\tlet maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| {\n \t\t\tsp_runtime::TryRuntimeError::Other(\"Failed to decode old value from storage\")\n \t\t})?;\n\n \t\tmatch maybe_old_value {\n \t\t\tSome(old_value) => {\n \t\t\t\tlet expected_new_value =\n \t\t\t\t\tcrate::CurrentAndPreviousValue { current: old_value, previous: None };\n \t\t\t\tlet actual_new_value = crate::Value::::get();\n\n \t\t\t\tensure!(actual_new_value.is_some(), \"New value not set\");\n \t\t\t\tensure!(\n \t\t\t\t\tactual_new_value == Some(expected_new_value),\n \t\t\t\t\t\"New value not set correctly\"\n \t\t\t\t);\n \t\t\t},\n \t\t\tNone => {\n \t\t\t\tensure!(crate::Value::::get().is_none(), \"New value unexpectedly set\");\n \t\t\t},\n \t\t};\n \t\tOk(())\n \t}\n }\n\n /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a\n /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that:\n /// - The migration only runs once when the on-chain storage version is 0\n /// - The on-chain storage version is updated to `1` after the migration executes\n /// - Reads/Writes from checking/settings the on-chain storage version are accounted for\n pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration<\n \t0, // The migration will only execute when the on-chain storage version is 0\n \t1, // The on-chain storage version will be set to 1 after the migration is complete\n \tInnerMigrateV0ToV1,\n \tcrate::pallet::Pallet,\n \t::DbWeight,\n >;\n ```"} 
-{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 6, "depth": 3, "title": "Migration Organization", "anchor": "migration-organization", "start_char": 14864, "end_char": 15556, "estimated_token_count": 148, "token_estimator": "heuristic-v1", "text": "### Migration Organization\n\nBest practices recommend organizing migrations in a separate module within your pallet. Here's the recommended file structure:\n\n```plain\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── migrations/ # All migration-related code\n│ ├── mod.rs # Migrations module definition\n│ ├── v1.rs # V0 -> V1 migration\n│ └── v2.rs # V1 -> V2 migration\n└── Cargo.toml\n```\n\nThis structure provides several benefits:\n\n- Separates migration logic from core pallet functionality.\n- Makes migrations easier to test and maintain.\n- Provides explicit versioning of storage changes.\n- Simplifies the addition of future migrations."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 7, "depth": 3, "title": "Scheduling Migrations", "anchor": "scheduling-migrations", "start_char": 15556, "end_char": 16131, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "### Scheduling Migrations\n\nTo execute migrations during a runtime upgrade, you must configure them in your runtime's Executive pallet. Add your migrations in `runtime/src/lib.rs`:\n\n```rust\n/// Tuple of migrations (structs that implement `OnRuntimeUpgrade`)\ntype Migrations = (\n pallet_my_pallet::migrations::v1::Migration,\n // More migrations can be added here\n);\npub type Executive = frame_executive::Executive<\n Runtime,\n Block,\n frame_system::ChainContext,\n Runtime,\n AllPalletsWithSystem,\n Migrations, // Include migrations here\n>;\n\n```"} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 8, "depth": 2, "title": "Single-Block Migrations", "anchor": "single-block-migrations", "start_char": 16131, "end_char": 17219, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Single-Block Migrations\n\nSingle-block migrations execute their logic within one block immediately following a runtime upgrade. They run as part of the runtime upgrade process through the `OnRuntimeUpgrade` trait implementation and must be completed before any other runtime logic executes.\n\nWhile single-block migrations are straightforward to implement and provide immediate data transformation, they carry significant risks. The most critical consideration is that they must complete within one block's weight limits. 
This is especially crucial for parachains, where exceeding block weight limits will brick the chain.\n\nUse single-block migrations only when you can guarantee:\n\n- The migration has a bounded execution time.\n- Weight calculations are thoroughly tested.\n- Total weight will never exceed block limits.\n\nFor a complete implementation example of a single-block migration, refer to the [single-block migration example]( https://paritytech.github.io/polkadot-sdk/master/pallet_example_single_block_migrations/index.html){target=\\_blank} in the Polkadot SDK documentation."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 9, "depth": 2, "title": "Multi Block Migrations", "anchor": "multi-block-migrations", "start_char": 17219, "end_char": 18500, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Multi Block Migrations\n\nMulti-block migrations distribute the migration workload across multiple blocks, providing a safer approach for production environments. The migration state is tracked in storage, allowing the process to pause and resume across blocks.\n\nThis approach is essential for production networks and parachains as the risk of exceeding block weight limits is eliminated. Multi-block migrations can safely handle large storage collections, unbounded data structures, and complex nested data types where weight consumption might be unpredictable.\n\nMulti-block migrations are ideal when dealing with:\n\n- Large-scale storage migrations.\n- Unbounded storage items or collections.\n- Complex data structures with uncertain weight costs.\n\nThe primary trade-off is increased implementation complexity, as you must manage the migration state and handle partial completion scenarios. However, multi-block migrations' significant safety benefits and operational reliability are typically worth the increased complexity.\n\nFor a complete implementation example of multi-block migrations, refer to the [official example](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/substrate/frame/examples/multi-block-migrations){target=\\_blank} in the Polkadot SDK."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 5, "depth": 3, "title": "Migration Structure", "anchor": "migration-structure", "start_char": 8023, "end_char": 10526, "estimated_token_count": 559, "token_estimator": "heuristic-v1", "text": "### Migration Structure\n\nThere are two approaches to implementing storage migrations. The first method involves directly implementing `OnRuntimeUpgrade` on structs. This approach requires manually checking the on-chain storage version against the new [`StorageVersion`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/struct.StorageVersion.html){target=\\_blank} and executing the transformation logic only when the check passes. This version verification prevents multiple executions of the migration during subsequent runtime upgrades.\n\nThe recommended approach is to implement [`UncheckedOnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.UncheckedOnRuntimeUpgrade.html){target=\\_blank} and wrap it with [`VersionedMigration`](https://paritytech.github.io/polkadot-sdk/master/frame_support/migrations/struct.VersionedMigration.html){target=\\_blank}. 
`VersionedMigration` implements `OnRuntimeUpgrade` and handles storage version management automatically, following best practices and reducing potential errors.\n\n`VersionedMigration` requires five type parameters:\n\n- **`From`**: The source version for the upgrade.\n- **`To`**: The target version for the upgrade.\n- **`Inner`**: The `UncheckedOnRuntimeUpgrade` implementation.\n- **`Pallet`**: The pallet being upgraded.\n- **`Weight`**: The runtime's [`RuntimeDbWeight`](https://paritytech.github.io/polkadot-sdk/master/frame_support/weights/struct.RuntimeDbWeight.html){target=\\_blank} implementation.\n\nExamine the following migration example that transforms a simple `StorageValue` storing a `u32` into a more complex structure that tracks both current and previous values using the `CurrentAndPreviousValue` struct:\n\n- Old `StorageValue` format:\n\n ```rust\n #[pallet::storage]\n pub type Value = StorageValue<_, u32>;\n ```\n\n- New `StorageValue` format:\n\n ```rust\n /// Example struct holding the most recently set [`u32`] and the\n /// second most recently set [`u32`] (if one existed).\n #[docify::export]\n #[derive(\n \tClone, Eq, PartialEq, Encode, Decode, RuntimeDebug, scale_info::TypeInfo, MaxEncodedLen,\n )]\n pub struct CurrentAndPreviousValue {\n \t/// The most recently set value.\n \tpub current: u32,\n \t/// The previous value, if one existed.\n \tpub previous: Option,\n }\n #[pallet::storage]\n \tpub type Value = StorageValue<_, CurrentAndPreviousValue>;\n ```\n\n- Migration:\n\n ```rust\n \n ```"} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 6, "depth": 3, "title": "Migration Organization", "anchor": "migration-organization", "start_char": 10526, "end_char": 11218, "estimated_token_count": 148, "token_estimator": "heuristic-v1", "text": "### Migration Organization\n\nBest practices recommend organizing migrations in a separate module within your pallet. Here's the recommended file structure:\n\n```plain\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── migrations/ # All migration-related code\n│ ├── mod.rs # Migrations module definition\n│ ├── v1.rs # V0 -> V1 migration\n│ └── v2.rs # V1 -> V2 migration\n└── Cargo.toml\n```\n\nThis structure provides several benefits:\n\n- Separates migration logic from core pallet functionality.\n- Makes migrations easier to test and maintain.\n- Provides explicit versioning of storage changes.\n- Simplifies the addition of future migrations."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 7, "depth": 3, "title": "Scheduling Migrations", "anchor": "scheduling-migrations", "start_char": 11218, "end_char": 11793, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "### Scheduling Migrations\n\nTo execute migrations during a runtime upgrade, you must configure them in your runtime's Executive pallet. 
Add your migrations in `runtime/src/lib.rs`:\n\n```rust\n/// Tuple of migrations (structs that implement `OnRuntimeUpgrade`)\ntype Migrations = (\n pallet_my_pallet::migrations::v1::Migration,\n // More migrations can be added here\n);\npub type Executive = frame_executive::Executive<\n Runtime,\n Block,\n frame_system::ChainContext,\n Runtime,\n AllPalletsWithSystem,\n Migrations, // Include migrations here\n>;\n\n```"} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 8, "depth": 2, "title": "Single-Block Migrations", "anchor": "single-block-migrations", "start_char": 11793, "end_char": 12881, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Single-Block Migrations\n\nSingle-block migrations execute their logic within one block immediately following a runtime upgrade. They run as part of the runtime upgrade process through the `OnRuntimeUpgrade` trait implementation and must be completed before any other runtime logic executes.\n\nWhile single-block migrations are straightforward to implement and provide immediate data transformation, they carry significant risks. The most critical consideration is that they must complete within one block's weight limits. This is especially crucial for parachains, where exceeding block weight limits will brick the chain.\n\nUse single-block migrations only when you can guarantee:\n\n- The migration has a bounded execution time.\n- Weight calculations are thoroughly tested.\n- Total weight will never exceed block limits.\n\nFor a complete implementation example of a single-block migration, refer to the [single-block migration example]( https://paritytech.github.io/polkadot-sdk/master/pallet_example_single_block_migrations/index.html){target=\\_blank} in the Polkadot SDK documentation."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 9, "depth": 2, "title": "Multi Block Migrations", "anchor": "multi-block-migrations", "start_char": 12881, "end_char": 14162, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Multi Block Migrations\n\nMulti-block migrations distribute the migration workload across multiple blocks, providing a safer approach for production environments. The migration state is tracked in storage, allowing the process to pause and resume across blocks.\n\nThis approach is essential for production networks and parachains as the risk of exceeding block weight limits is eliminated. Multi-block migrations can safely handle large storage collections, unbounded data structures, and complex nested data types where weight consumption might be unpredictable.\n\nMulti-block migrations are ideal when dealing with:\n\n- Large-scale storage migrations.\n- Unbounded storage items or collections.\n- Complex data structures with uncertain weight costs.\n\nThe primary trade-off is increased implementation complexity, as you must manage the migration state and handle partial completion scenarios. 
However, multi-block migrations' significant safety benefits and operational reliability are typically worth the increased complexity.\n\nFor a complete implementation example of multi-block migrations, refer to the [official example](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/substrate/frame/examples/multi-block-migrations){target=\\_blank} in the Polkadot SDK."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 1071, "estimated_token_count": 182, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nParachain locks are a critical security mechanism in the Polkadot ecosystem designed to maintain decentralization during the parachain lifecycle. These locks prevent potential centralization risks that could emerge during the early stages of parachain operation.\n\nThe locking system follows strict, well-defined conditions that distribute control across multiple authorities:\n\n- Relay chain governance has the authority to lock any parachain.\n- A parachain can lock its own lock.\n- Parachain managers have permission to lock the parachain.\n- Parachains are locked automatically when they successfully produce their first block.\n\nSimilarly, unlocking a parachain follows controlled procedures:\n\n- Relay chain governance retains the authority to unlock any parachain.\n- A parachain can unlock its own lock.\n\nThis document guides you through checking a parachain's lock status and safely executing the unlock procedure from a parachain using [XCM (Cross-Consensus Messaging)](/develop/interoperability/intro-to-xcm/){target=\\_blank}."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 1, "depth": 2, "title": "Check If the Parachain Is Locked", "anchor": "check-if-the-parachain-is-locked", "start_char": 1071, "end_char": 2100, "estimated_token_count": 262, "token_estimator": "heuristic-v1", "text": "## Check If the Parachain Is Locked\n\nBefore unlocking a parachain, you should verify its current lock status. This can be done through the Polkadot.js interface:\n\n1. In [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\\_blank}, connect to the relay chain, navigate to the **Developer** dropdown and select the **Chain State** option.\n\n2. Query the parachain locked status:\n 1. Select **`registrar`**.\n 2. Choose the **`paras`** option.\n 3. Input the parachain ID you want to check as a parameter (e.g. `2006`).\n 4. Click the **+** button to execute the query.\n 5. 
Check the status of the parachain lock.\n - **`manager`**: The account that has placed a deposit for registering this parachain.\n - **`deposit`**: The amount reserved by the `manager` account for the registration.\n - **`locked`**: Whether the parachain registration should be locked from being controlled by the manager.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-1.webp)"} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 2, "depth": 2, "title": "How to Unlock a Parachain", "anchor": "how-to-unlock-a-parachain", "start_char": 2100, "end_char": 2755, "estimated_token_count": 121, "token_estimator": "heuristic-v1", "text": "## How to Unlock a Parachain\n\nUnlocking a parachain requires sending an XCM (Cross-Consensus Message) to the relay chain from the parachain itself, sending a message with Root origin, or this can be accomplished through the relay chain's governance mechanism, executing a root call.\n\nIf sending an XCM, the parachain origin must have proper authorization, typically from either the parachain's sudo pallet (if enabled) or its governance system.\n\nThis guide demonstrates the unlocking process using a parachain with the sudo pallet. For parachains using governance-based authorization instead, the process will require adjustments to how the XCM is sent."} @@ -264,16 +264,16 @@ {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 4, "depth": 3, "title": "Fund the Sovereign Account", "anchor": "fund-the-sovereign-account", "start_char": 4169, "end_char": 6045, "estimated_token_count": 414, "token_estimator": "heuristic-v1", "text": "### Fund the Sovereign Account\n\nFor a successful XCM execution, the [sovereign account](https://github.com/polkadot-fellows/xcm-format/blob/10726875bd3016c5e528c85ed6e82415e4b847d7/README.md?plain=1#L50){target=\\_blank} of your parachain on the relay chain must have sufficient funds to cover transaction fees. The sovereign account is a deterministic address derived from your parachain ID.\n\nYou can identify your parachain's sovereign account using either of these methods:\n\n=== \"Runtime API\"\n\n Execute the `locationToAccountApi.convertLocation` runtime API call to convert your parachain's location into its sovereign account address on the relay chain.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-7.webp)\n\n=== \"Substrate Utilities\"\n\n Use the **\"Para ID\" to Address** section in [Substrate Utilities](https://www.shawntabrizi.com/substrate-js-utilities/){target=\\_blank} with the **Child** option selected.\n\n=== \"Manual Calculation\"\n\n 1. Identify the appropriate prefix:\n\n - For parent/child chains use the prefix `0x70617261` (which decodes to `b\"para\"`).\n \n 2. Encode your parachain ID as a u32 [SCALE](/polkadot-protocol/parachain-basics/data-encoding#data-types){target=\\_blank} value:\n\n - For parachain 2006, this would be `d6070000`.\n\n 3. Combine the prefix with the encoded ID to form the sovereign account address:\n\n - **Hex**: `0x70617261d6070000000000000000000000000000000000000000000000000000`\n - **SS58 format**: `5Ec4AhPW97z4ZyYkd3mYkJrSeZWcwVv4wiANES2QrJi1x17F`\n\nYou can transfer funds to this account from any account on the relay chain using a standard transfer. To calculate the amount needed, refer to the [XCM Payment API](/develop/interoperability/xcm-runtime-apis/#xcm-payment-api){target=\\_blank}. 
The calculation will depend on the XCM built in the next step."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 5, "depth": 3, "title": "Craft and Submit the XCM", "anchor": "craft-and-submit-the-xcm", "start_char": 6045, "end_char": 9223, "estimated_token_count": 710, "token_estimator": "heuristic-v1", "text": "### Craft and Submit the XCM\n\nWith the call data prepared and the sovereign account funded, you can now construct and send the XCM from your parachain to the relay chain. The XCM will need to perform several operations in sequence:\n\n1. Withdraw DOT from your parachain's sovereign account.\n2. Buy execution to pay for transaction fees.\n3. Execute the `registrar.removeLock` extrinsic.\n4. Return any unused funds to your sovereign account.\n\nHere's how to submit this XCM using Astar (Parachain 2006) as an example:\n\n1. In [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\\_blank}, connect to the parachain, navigate to the **Developer** dropdown and select the **Extrinsics** option.\n\n2. Create a `sudo.sudo` extrinsic that executes `polkadotXcm.send`:\n\n 1. Use the `sudo.sudo` extrinsic to execute the following call as Root.\n 2. Select the **polkadotXcm** pallet.\n 3. Choose the **send** extrinsic.\n 4. Set the **dest** parameter as the relay chain.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-4.webp)\n\n3. Construct the XCM and submit it:\n\n 1. Add a **WithdrawAsset** instruction.\n 2. Add a **BuyExecution** instruction.\n - **fees**:\n - **id**: The asset location to use for the fee payment. In this example, the relay chain native asset is used.\n - **fun**: Select `Fungible` and use the same amount you withdrew from the sovereign account in the previous step.\n - **weightLimit**: Use `Unlimited`.\n 3. Add a **Transact** instruction with the following parameters:\n - **originKind**: Use `Native`.\n - **requireWeightAtMost**: Use the weight calculated previously.\n - **call**: Use the encoded call data generated before.\n 4. Add a **RefundSurplus** instruction.\n 5. Add a **DepositAsset** instruction to send the remaining funds to the parachain sovereign account.\n 6. Click the **Submit Transaction** button.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-5.webp)\n\n If the amount withdrawn in the first instruction is exactly the amount needed to pay the transaction fees, instructions 4 and 5 can be omitted.\n\n To validate your XCM, examine the following reference [extrinsic](https://polkadot.js.org/apps/?rpc=wss%3A%2F%2Fastar.public.curie.radiumblock.co%2Fws#/extrinsics/decode/0x63003300040100041400040000000700e40b5402130000000700e40b540200060042d3c91800184604d6070000140d0100000100591f){target=_blank} showing the proper instruction sequence and parameter formatting. Following this structure will help ensure successful execution of your message.\n\nAfter submitting the transaction, wait for it to be finalized and then verify that your parachain has been successfully unlocked by following the steps described in the [Check if the Parachain is Locked](#check-if-the-parachain-is-locked) section. If the parachain shows as unlocked, your operation has been successful. 
If it still appears locked, verify that your XCM transaction was processed correctly and consider troubleshooting the XCM built.\n\n![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-6.webp)"} {"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 446, "end_char": 496, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 496, "end_char": 1372, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-maintenance", "page_title": "Maintenance", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 496, "end_char": 1350, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 16, "end_char": 1221, "estimated_token_count": 239, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBenchmarking is a critical component of developing efficient and secure blockchain runtimes. In the Polkadot ecosystem, accurately benchmarking your custom pallets ensures that each extrinsic has a precise [weight](/polkadot-protocol/glossary/#weight){target=\\_blank}, representing its computational and storage demands. This process is vital for maintaining the blockchain's performance and preventing potential vulnerabilities, such as Denial of Service (DoS) attacks.\n\nThe Polkadot SDK leverages the [FRAME](/polkadot-protocol/glossary/#frame-framework-for-runtime-aggregation-of-modularized-entities){target=\\_blank} benchmarking framework, offering tools to measure and assign weights to extrinsics. These weights help determine the maximum number of transactions or system-level calls processed within a block. This guide covers how to use FRAME's [benchmarking framework](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=\\_blank}, from setting up your environment to writing and running benchmarks for your custom pallets. You'll understand how to generate accurate weights by the end, ensuring your runtime remains performant and secure."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 1, "depth": 2, "title": "The Case for Benchmarking", "anchor": "the-case-for-benchmarking", "start_char": 1221, "end_char": 2015, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## The Case for Benchmarking\n\nBenchmarking helps validate that the required execution time for different functions is within reasonable boundaries to ensure your blockchain runtime can handle transactions efficiently and securely. By accurately measuring the weight of each extrinsic, you can prevent service interruptions caused by computationally intensive calls that exceed block time limits. 
Without benchmarking, runtime performance could be vulnerable to DoS attacks, where malicious users exploit functions with unoptimized weights.\n\nBenchmarking also ensures predictable transaction fees. Weights derived from benchmark tests accurately reflect the resource usage of function calls, allowing fair fee calculation. This approach discourages abuse while maintaining network reliability."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 2, "depth": 3, "title": "Benchmarking and Weight", "anchor": "benchmarking-and-weight", "start_char": 2015, "end_char": 3681, "estimated_token_count": 321, "token_estimator": "heuristic-v1", "text": "### Benchmarking and Weight \n\nIn Polkadot SDK-based chains, weight quantifies the computational effort needed to process transactions. This weight includes factors such as:\n\n- Computational complexity.\n- Storage complexity (proof size).\n- Database reads and writes.\n- Hardware specifications.\n\nBenchmarking uses real-world testing to simulate worst-case scenarios for extrinsics. The framework generates a linear model for weight calculation by running multiple iterations with varied parameters. These worst-case weights ensure blocks remain within execution limits, enabling the runtime to maintain throughput under varying loads. Excess fees can be refunded if a call uses fewer resources than expected, offering users a fair cost model.\n \nBecause weight is a generic unit of measurement based on computation time for a specific physical machine, the weight of any function can change based on the specifications of hardware used for benchmarking. By modeling the expected weight of each runtime function, the blockchain can calculate the number of transactions or system-level calls it can execute within a certain period.\n\nWithin FRAME, each function call that is dispatched must have a `#[pallet::weight]` annotation that can return the expected weight for the worst-case scenario execution of that function given its inputs:\n\n```rust hl_lines=\"2\"\n#[pallet::call_index(0)]\n#[pallet::weight(T::WeightInfo::do_something())]\npub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n```\n\nThe `WeightInfo` file is automatically generated during benchmarking. Based on these tests, this file provides accurate weights for each extrinsic."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 3, "depth": 2, "title": "Benchmarking Process", "anchor": "benchmarking-process", "start_char": 3681, "end_char": 4224, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Benchmarking Process\n\nBenchmarking a pallet involves the following steps: \n\n1. Creating a `benchmarking.rs` file within your pallet's structure.\n2. Writing a benchmarking test for each extrinsic.\n3. Executing the benchmarking tool to calculate weights based on performance metrics.\n\nThe benchmarking tool runs multiple iterations to model worst-case execution times and determine the appropriate weight. By default, the benchmarking pipeline is deactivated. 
To activate it, compile your runtime with the `runtime-benchmarks` feature flag."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 4, "depth": 3, "title": "Prepare Your Environment", "anchor": "prepare-your-environment", "start_char": 4224, "end_char": 5278, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "### Prepare Your Environment\n\nInstall the [`frame-omni-bencher`](https://crates.io/crates/frame-omni-bencher){target=\\_blank} command-line tool:\n\n```bash\ncargo install frame-omni-bencher\n```\n\nBefore writing benchmark tests, you need to ensure the `frame-benchmarking` crate is included in your pallet's `Cargo.toml` similar to the following:\n\n```toml title=\"Cargo.toml\"\nframe-benchmarking = { version = \"37.0.0\", default-features = false }\n```\n\nYou must also ensure that you add the `runtime-benchmarks` feature flag as follows under the `[features]` section of your pallet's `Cargo.toml`:\n\n```toml title=\"Cargo.toml\"\nruntime-benchmarks = [\n \"frame-benchmarking/runtime-benchmarks\",\n \"frame-support/runtime-benchmarks\",\n \"frame-system/runtime-benchmarks\",\n \"sp-runtime/runtime-benchmarks\",\n]\n```\n\nLastly, ensure that `frame-benchmarking` is included in `std = []`: \n\n```toml title=\"Cargo.toml\"\nstd = [\n # ...\n \"frame-benchmarking?/std\",\n # ...\n]\n```\n\nOnce complete, you have the required dependencies for writing benchmark tests for your pallet."} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 5, "depth": 3, "title": "Write Benchmark Tests", "anchor": "write-benchmark-tests", "start_char": 5278, "end_char": 7734, "estimated_token_count": 645, "token_estimator": "heuristic-v1", "text": "### Write Benchmark Tests\n\nCreate a `benchmarking.rs` file in your pallet's `src/`. Your directory structure should look similar to the following:\n\n```\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── benchmarking.rs # Benchmarking\n└── Cargo.toml\n```\n\nWith the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\\_blank} to get started as follows:\n\n```rust title=\"benchmarking.rs (starter template)\"\n//! Benchmarking setup for pallet-template\n#![cfg(feature = \"runtime-benchmarks\")]\n\nuse super::*;\nuse frame_benchmarking::v2::*;\n\n#[benchmarks]\nmod benchmarks {\n\tuse super::*;\n\t#[cfg(test)]\n\tuse crate::pallet::Pallet as Template;\n\tuse frame_system::RawOrigin;\n\n\t#[benchmark]\n\tfn do_something() {\n\t\tlet caller: T::AccountId = whitelisted_caller();\n\t\t#[extrinsic_call]\n\t\tdo_something(RawOrigin::Signed(caller), 100);\n\n\t\tassert_eq!(Something::::get().map(|v| v.block_number), Some(100u32.into()));\n\t}\n\n\t#[benchmark]\n\tfn cause_error() {\n\t\tSomething::::put(CompositeStruct { block_number: 100u32.into() });\n\t\tlet caller: T::AccountId = whitelisted_caller();\n\t\t#[extrinsic_call]\n\t\tcause_error(RawOrigin::Signed(caller));\n\n\t\tassert_eq!(Something::::get().map(|v| v.block_number), Some(101u32.into()));\n\t}\n\n\timpl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test);\n}\n```\n\nIn your benchmarking tests, employ these best practices:\n\n- **Write custom testing functions**: The function `do_something` in the preceding example is a placeholder. 
Similar to writing unit tests, you must write custom functions to benchmark test your extrinsics. Access the mock runtime and use functions such as `whitelisted_caller()` to sign transactions and facilitate testing.\n- **Use the `#[extrinsic_call]` macro**: This macro is used when calling the extrinsic itself and is a required part of a benchmarking function. See the [`extrinsic_call`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html#extrinsic_call-and-block){target=\\_blank} docs for more details.\n- **Validate extrinsic behavior**: The `assert_eq` expression ensures that the extrinsic is working properly within the benchmark context.\n\nAdd the `benchmarking` module to your pallet. In the pallet `lib.rs` file add the following:\n\n```rust\n#[cfg(feature = \"runtime-benchmarks\")]\nmod benchmarking;\n```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 6, "depth": 3, "title": "Add Benchmarks to Runtime", "anchor": "add-benchmarks-to-runtime", "start_char": 7734, "end_char": 9863, "estimated_token_count": 418, "token_estimator": "heuristic-v1", "text": "### Add Benchmarks to Runtime\n\nBefore running the benchmarking tool, you must integrate benchmarks with your runtime as follows:\n\n1. Navigate to your `runtime/src` directory and check if a `benchmarks.rs` file exists. If not, create one. This file will contain the macro that registers all pallets for benchmarking along with their respective configurations:\n\n ```rust title=\"benchmarks.rs\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n [pallet_balances, Balances]\n [pallet_session, SessionBench::]\n [pallet_timestamp, Timestamp]\n [pallet_message_queue, MessageQueue]\n [pallet_sudo, Sudo]\n [pallet_collator_selection, CollatorSelection]\n [cumulus_pallet_parachain_system, ParachainSystem]\n [cumulus_pallet_xcmp_queue, XcmpQueue]\n );\n ```\n\n For example, to add a new pallet named `pallet_parachain_template` for benchmarking, include it in the macro as shown:\n ```rust title=\"benchmarks.rs\" hl_lines=\"3\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n );\n ```\n\n !!!warning \"Updating `define_benchmarks!` macro is required\"\n Any pallet that needs to be benchmarked must be included in the [`define_benchmarks!`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/macro.define_benchmarks.html){target=\\_blank} macro. The CLI will only be able to access and benchmark pallets that are registered here.\n\n2. Check your runtime's `lib.rs` file to ensure the `benchmarks` module is imported. The import should look like this:\n\n ```rust title=\"lib.rs\"\n #[cfg(feature = \"runtime-benchmarks\")]\n mod benchmarks;\n ```\n\n The `runtime-benchmarks` feature gate ensures benchmark tests are isolated from production runtime code.\n\n3. 
Enable runtime benchmarking for your pallet in `runtime/Cargo.toml`:\n\n ```toml\n runtime-benchmarks = [\n # ...\n \"pallet_parachain_template/runtime-benchmarks\",\n ]\n\n ```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 7, "depth": 3, "title": "Run Benchmarks", "anchor": "run-benchmarks", "start_char": 9863, "end_char": 14248, "estimated_token_count": 1100, "token_estimator": "heuristic-v1", "text": "### Run Benchmarks\n\nYou can now compile your runtime with the `runtime-benchmarks` feature flag. This feature flag is crucial as the benchmarking tool will look for this feature being enabled to know when it should run benchmark tests. Follow these steps to compile the runtime with benchmarking enabled:\n\n1. Run `build` with the feature flag included:\n\n ```bash\n cargo build --features runtime-benchmarks --release\n ```\n\n2. Create a `weights.rs` file in your pallet's `src/` directory. This file will store the auto-generated weight calculations:\n\n ```bash\n touch weights.rs\n ```\n\n3. Before running the benchmarking tool, you'll need a template file that defines how weight information should be formatted. Download the official template from the Polkadot SDK repository and save it in your project folders for future use:\n\n ```bash\n curl https://raw.githubusercontent.com/paritytech/polkadot-sdk/refs/tags/polkadot-stable2412/substrate/.maintain/frame-weight-template.hbs \\\n --output ./pallets/benchmarking/frame-weight-template.hbs\n ```\n\n4. Run the benchmarking tool to measure extrinsic weights:\n\n ```bash\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet INSERT_NAME_OF_PALLET \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output weights.rs\n ```\n\n !!! tip \"Flag definitions\"\n - **`--runtime`**: The path to your runtime's Wasm.\n - **`--pallet`**: The name of the pallet you wish to benchmark. This pallet must be configured in your runtime and defined in `define_benchmarks`.\n - **`--extrinsic`**: Which extrinsic to test. Using `\"\"` implies all extrinsics will be benchmarked.\n - **`--template`**: Defines how weight information should be formatted.\n - **`--output`**: Where the output of the auto-generated weights will reside.\n\nThe generated `weights.rs` file contains weight annotations for your extrinsics, ready to be added to your pallet. The output should be similar to the following. Some output is omitted for brevity:\n\n
\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet \"INSERT_NAME_OF_PALLET\" \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output ./weights.rs\n ...\n 2025-01-15T16:41:33.557045Z INFO polkadot_sdk_frame::benchmark::pallet: [ 0 % ] Starting benchmark: pallet_parachain_template::do_something\n 2025-01-15T16:41:33.564644Z INFO polkadot_sdk_frame::benchmark::pallet: [ 50 % ] Starting benchmark: pallet_parachain_template::cause_error\n ...\n Created file: \"weights.rs\"\n \n
\n\n#### Add Benchmark Weights to Pallet\n\nOnce the `weights.rs` is generated, you must integrate it with your pallet. \n\n1. To begin the integration, import the `weights` module and the `WeightInfo` trait, then add both to your pallet's `Config` trait. Complete the following steps to set up the configuration:\n\n ```rust title=\"lib.rs\"\n pub mod weights;\n use crate::weights::WeightInfo;\n\n /// Configure the pallet by specifying the parameters and types on which it depends.\n #[pallet::config]\n pub trait Config: frame_system::Config {\n // ...\n /// A type representing the weights required by the dispatchables of this pallet.\n type WeightInfo: WeightInfo;\n }\n ```\n\n2. Next, you must add this to the `#[pallet::weight]` annotation in all the extrinsics via the `Config` as follows:\n\n ```rust hl_lines=\"2\" title=\"lib.rs\"\n #[pallet::call_index(0)]\n #[pallet::weight(T::WeightInfo::do_something())]\n pub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n ```\n\n3. Finally, configure the actual weight values in your runtime. In `runtime/src/config/mod.rs`, add the following code:\n\n ```rust title=\"mod.rs\"\n // Configure pallet.\n impl pallet_parachain_template::Config for Runtime {\n // ...\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n ```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 14248, "end_char": 14731, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- View the Rust Docs for a more comprehensive, low-level view of the [FRAME V2 Benchmarking Suite](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=_blank}.\n- Read the [FRAME Benchmarking and Weights](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/frame_benchmarking_weight/index.html){target=_blank} reference document, a concise guide which details how weights and benchmarking work."} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 5, "depth": 3, "title": "Write Benchmark Tests", "anchor": "write-benchmark-tests", "start_char": 5278, "end_char": 6838, "estimated_token_count": 377, "token_estimator": "heuristic-v1", "text": "### Write Benchmark Tests\n\nCreate a `benchmarking.rs` file in your pallet's `src/`. Your directory structure should look similar to the following:\n\n```\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── benchmarking.rs # Benchmarking\n└── Cargo.toml\n```\n\nWith the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\\_blank} to get started as follows:\n\n```rust title=\"benchmarking.rs (starter template)\"\n\n```\n\nIn your benchmarking tests, employ these best practices:\n\n- **Write custom testing functions**: The function `do_something` in the preceding example is a placeholder. Similar to writing unit tests, you must write custom functions to benchmark test your extrinsics. Access the mock runtime and use functions such as `whitelisted_caller()` to sign transactions and facilitate testing.\n- **Use the `#[extrinsic_call]` macro**: This macro is used when calling the extrinsic itself and is a required part of a benchmarking function. 
See the [`extrinsic_call`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html#extrinsic_call-and-block){target=\\_blank} docs for more details.\n- **Validate extrinsic behavior**: The `assert_eq` expression ensures that the extrinsic is working properly within the benchmark context.\n\nAdd the `benchmarking` module to your pallet. In the pallet `lib.rs` file add the following:\n\n```rust\n#[cfg(feature = \"runtime-benchmarks\")]\nmod benchmarking;\n```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 6, "depth": 3, "title": "Add Benchmarks to Runtime", "anchor": "add-benchmarks-to-runtime", "start_char": 6838, "end_char": 8967, "estimated_token_count": 418, "token_estimator": "heuristic-v1", "text": "### Add Benchmarks to Runtime\n\nBefore running the benchmarking tool, you must integrate benchmarks with your runtime as follows:\n\n1. Navigate to your `runtime/src` directory and check if a `benchmarks.rs` file exists. If not, create one. This file will contain the macro that registers all pallets for benchmarking along with their respective configurations:\n\n ```rust title=\"benchmarks.rs\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n [pallet_balances, Balances]\n [pallet_session, SessionBench::]\n [pallet_timestamp, Timestamp]\n [pallet_message_queue, MessageQueue]\n [pallet_sudo, Sudo]\n [pallet_collator_selection, CollatorSelection]\n [cumulus_pallet_parachain_system, ParachainSystem]\n [cumulus_pallet_xcmp_queue, XcmpQueue]\n );\n ```\n\n For example, to add a new pallet named `pallet_parachain_template` for benchmarking, include it in the macro as shown:\n ```rust title=\"benchmarks.rs\" hl_lines=\"3\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n );\n ```\n\n !!!warning \"Updating `define_benchmarks!` macro is required\"\n Any pallet that needs to be benchmarked must be included in the [`define_benchmarks!`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/macro.define_benchmarks.html){target=\\_blank} macro. The CLI will only be able to access and benchmark pallets that are registered here.\n\n2. Check your runtime's `lib.rs` file to ensure the `benchmarks` module is imported. The import should look like this:\n\n ```rust title=\"lib.rs\"\n #[cfg(feature = \"runtime-benchmarks\")]\n mod benchmarks;\n ```\n\n The `runtime-benchmarks` feature gate ensures benchmark tests are isolated from production runtime code.\n\n3. Enable runtime benchmarking for your pallet in `runtime/Cargo.toml`:\n\n ```toml\n runtime-benchmarks = [\n # ...\n \"pallet_parachain_template/runtime-benchmarks\",\n ]\n\n ```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 7, "depth": 3, "title": "Run Benchmarks", "anchor": "run-benchmarks", "start_char": 8967, "end_char": 13352, "estimated_token_count": 1100, "token_estimator": "heuristic-v1", "text": "### Run Benchmarks\n\nYou can now compile your runtime with the `runtime-benchmarks` feature flag. This feature flag is crucial as the benchmarking tool will look for this feature being enabled to know when it should run benchmark tests. Follow these steps to compile the runtime with benchmarking enabled:\n\n1. Run `build` with the feature flag included:\n\n ```bash\n cargo build --features runtime-benchmarks --release\n ```\n\n2. 
Create a `weights.rs` file in your pallet's `src/` directory. This file will store the auto-generated weight calculations:\n\n ```bash\n touch weights.rs\n ```\n\n3. Before running the benchmarking tool, you'll need a template file that defines how weight information should be formatted. Download the official template from the Polkadot SDK repository and save it in your project folders for future use:\n\n ```bash\n curl https://raw.githubusercontent.com/paritytech/polkadot-sdk/refs/tags/polkadot-stable2412/substrate/.maintain/frame-weight-template.hbs \\\n --output ./pallets/benchmarking/frame-weight-template.hbs\n ```\n\n4. Run the benchmarking tool to measure extrinsic weights:\n\n ```bash\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet INSERT_NAME_OF_PALLET \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output weights.rs\n ```\n\n !!! tip \"Flag definitions\"\n - **`--runtime`**: The path to your runtime's Wasm.\n - **`--pallet`**: The name of the pallet you wish to benchmark. This pallet must be configured in your runtime and defined in `define_benchmarks`.\n - **`--extrinsic`**: Which extrinsic to test. Using `\"\"` implies all extrinsics will be benchmarked.\n - **`--template`**: Defines how weight information should be formatted.\n - **`--output`**: Where the output of the auto-generated weights will reside.\n\nThe generated `weights.rs` file contains weight annotations for your extrinsics, ready to be added to your pallet. The output should be similar to the following. Some output is omitted for brevity:\n\n
\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet \"INSERT_NAME_OF_PALLET\" \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output ./weights.rs\n ...\n 2025-01-15T16:41:33.557045Z INFO polkadot_sdk_frame::benchmark::pallet: [ 0 % ] Starting benchmark: pallet_parachain_template::do_something\n 2025-01-15T16:41:33.564644Z INFO polkadot_sdk_frame::benchmark::pallet: [ 50 % ] Starting benchmark: pallet_parachain_template::cause_error\n ...\n Created file: \"weights.rs\"\n \n
\n\n#### Add Benchmark Weights to Pallet\n\nOnce the `weights.rs` is generated, you must integrate it with your pallet. \n\n1. To begin the integration, import the `weights` module and the `WeightInfo` trait, then add both to your pallet's `Config` trait. Complete the following steps to set up the configuration:\n\n ```rust title=\"lib.rs\"\n pub mod weights;\n use crate::weights::WeightInfo;\n\n /// Configure the pallet by specifying the parameters and types on which it depends.\n #[pallet::config]\n pub trait Config: frame_system::Config {\n // ...\n /// A type representing the weights required by the dispatchables of this pallet.\n type WeightInfo: WeightInfo;\n }\n ```\n\n2. Next, you must add this to the `#[pallet::weight]` annotation in all the extrinsics via the `Config` as follows:\n\n ```rust hl_lines=\"2\" title=\"lib.rs\"\n #[pallet::call_index(0)]\n #[pallet::weight(T::WeightInfo::do_something())]\n pub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n ```\n\n3. Finally, configure the actual weight values in your runtime. In `runtime/src/config/mod.rs`, add the following code:\n\n ```rust title=\"mod.rs\"\n // Configure pallet.\n impl pallet_parachain_template::Config for Runtime {\n // ...\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n ```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 13352, "end_char": 13835, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- View the Rust Docs for a more comprehensive, low-level view of the [FRAME V2 Benchmarking Suite](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=_blank}.\n- Read the [FRAME Benchmarking and Weights](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/frame_benchmarking_weight/index.html){target=_blank} reference document, a concise guide which details how weights and benchmarking work."} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 16, "end_char": 474, "estimated_token_count": 78, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nTesting is essential in Polkadot SDK development to ensure your blockchain operates as intended and effectively handles various potential scenarios. 
This guide walks you through setting up an environment to test pallets within the [runtime](/polkadot-protocol/glossary#runtime){target=_blank}, allowing you to evaluate how different pallets, their configurations, and system components interact to ensure reliable blockchain functionality."} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 1, "depth": 2, "title": "Configuring a Mock Runtime", "anchor": "configuring-a-mock-runtime", "start_char": 474, "end_char": 505, "estimated_token_count": 6, "token_estimator": "heuristic-v1", "text": "## Configuring a Mock Runtime"} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 2, "depth": 3, "title": "Testing Module", "anchor": "testing-module", "start_char": 505, "end_char": 2264, "estimated_token_count": 348, "token_estimator": "heuristic-v1", "text": "### Testing Module\n\nThe mock runtime includes all the necessary pallets and configurations needed for testing. To ensure proper testing, you must create a module that integrates all components, enabling assessment of interactions between pallets and system elements.\n\nHere's a simple example of how to create a testing module that simulates these interactions:\n\n```rust\npub mod tests {\n use crate::*;\n // ...\n}\n```\n\nThe `crate::*;` snippet imports all the components from your crate (including runtime configurations, pallet modules, and utility functions) into the `tests` module. This allows you to write tests without manually importing each piece, making the code more concise and readable. You can opt to instead create a separate `mock.rs` file to define the configuration for your mock runtime and a companion `tests.rs` file to house the specific logic for each test.\n\nOnce the testing module is configured, you can craft your mock runtime using the [`frame_support::runtime`](https://paritytech.github.io/polkadot-sdk/master/frame_support/attr.runtime.html){target=\\_blank} macro. This macro allows you to define a runtime environment that will be created for testing purposes:\n\n```rust\npub mod tests {\n use crate::*;\n\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n RuntimeCall,\n RuntimeEvent,\n RuntimeError,\n RuntimeOrigin,\n RuntimeFreezeReason,\n RuntimeHoldReason,\n RuntimeSlashReason,\n RuntimeLockId,\n RuntimeTask\n )]\n pub struct Test;\n\n #[runtime::pallet_index(0)]\n pub type System = frame_system::Pallet;\n\n // Other pallets...\n }\n}\n```"} @@ -288,7 +288,7 @@ {"page_id": "develop-parachains-testing-pallet-testing", "page_title": "Pallet Testing", "index": 5, "depth": 3, "title": "Event Testing", "anchor": "event-testing", "start_char": 4108, "end_char": 6129, "estimated_token_count": 519, "token_estimator": "heuristic-v1", "text": "### Event Testing\n\nIt's also crucial to test the events that your pallet emits during execution. By default, events generated in a pallet using the [`#generate_deposit`](https://paritytech.github.io/polkadot-sdk/master/frame_support/pallet_macros/attr.generate_deposit.html){target=\\_blank} macro are stored under the system's event storage key (system/events) as [`EventRecord`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.EventRecord.html){target=\\_blank} entries. 
These can be accessed using [`System::events()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.events){target=\\_blank} or verified with specific helper methods provided by the system pallet, such as [`assert_has_event`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.assert_has_event){target=\\_blank} and [`assert_last_event`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.assert_last_event){target=\\_blank}.\n\nHere's an example of testing events in a mock runtime:\n\n```rust\n#[test]\nfn it_emits_events_on_success() {\n new_test_ext().execute_with(|| {\n // Call an extrinsic or function\n assert_ok!(TemplateModule::some_function(Origin::signed(1), valid_param));\n\n // Verify that the expected event was emitted\n assert!(System::events().iter().any(|record| {\n record.event == Event::TemplateModule(TemplateEvent::SomeEvent)\n }));\n });\n}\n```\n\nSome key considerations are:\n\n- **Block number**: Events are not emitted on the genesis block, so you need to set the block number using [`System::set_block_number()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.set_block_number){target=\\_blank} to ensure events are triggered.\n- **Converting events**: Use `.into()` when instantiating your pallet's event to convert it into a generic event type, as required by the system's event storage."} {"page_id": "develop-parachains-testing-pallet-testing", "page_title": "Pallet Testing", "index": 6, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 6129, "end_char": 6871, "estimated_token_count": 211, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- Dive into the full implementation of the [`mock.rs`](https://github.com/paritytech/polkadot-sdk/blob/master/templates/solochain/pallets/template/src/mock.rs){target=\\_blank} and [`test.rs`](https://github.com/paritytech/polkadot-sdk/blob/master/templates/solochain/pallets/template/src/tests.rs){target=\\_blank} files in the [Solochain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/solochain){target=_blank}.\n\n
\n\n- Guide __Benchmarking__\n\n ---\n\n Explore methods to measure the performance and execution cost of your pallet.\n\n [:octicons-arrow-right-24: Reference](/develop/parachains/testing/benchmarking)\n\n
"} {"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 479, "end_char": 529, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 529, "end_char": 1243, "estimated_token_count": 199, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-parachains-testing", "page_title": "Testing Your Polkadot SDK-Based Blockchain", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 529, "end_char": 1221, "estimated_token_count": 193, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-parachains", "page_title": "Parachains", "index": 0, "depth": 2, "title": "Building Parachains with the Polkadot SDK", "anchor": "building-parachains-with-the-polkadot-sdk", "start_char": 275, "end_char": 1983, "estimated_token_count": 329, "token_estimator": "heuristic-v1", "text": "## Building Parachains with the Polkadot SDK\n\nWith the [Polkadot relay chain](/polkadot-protocol/architecture/polkadot-chain/){target=\\_blank} handling security and consensus, parachain developers are free to focus on features such as asset management, governance, and cross-chain communication. The Polkadot SDK equips developers with the tools to build, deploy, and maintain efficient, scalable parachains.\n\nPolkadot SDK’s FRAME framework provides developers with the tools to do the following:\n\n- **Customize parachain runtimes**: [Runtimes](/polkadot-protocol/glossary/#runtime){target=\\_blank} are the core building blocks that define the logic and functionality of Polkadot SDK-based parachains and let developers customize the parameters, rules, and behaviors that shape their blockchain network.\n- **Develop new pallets**: Create custom modular pallets to define runtime behavior and achieve desired blockchain functionality.\n- **Add smart contract functionality**: Use specialized pallets to deploy and execute smart contracts, enhancing your chain's functionality and programmability.\n- **Test your build for a confident deployment**: Create a test environment that can simulate runtime and mock transaction execution.\n- **Deploy your blockchain for use**: Take your Polkadot SDK-based blockchain from a local environment to production.\n- **Maintain your network including monitoring and upgrades**: Runtimes can be upgraded through forkless runtime updates, enabling seamless evolution of the parachain.\n\nNew to parachain development? 
Start with the [Introduction to the Polkadot SDK](/develop/parachains/intro-polkadot-sdk/) to discover how this framework simplifies building custom parachains."} {"page_id": "develop-parachains", "page_title": "Parachains", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1983, "end_char": 2032, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-smart-contracts-block-explorers", "page_title": "Block Explorers", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 189, "end_char": 497, "estimated_token_count": 49, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nBlock explorers serve as comprehensive blockchain analytics platforms that provide access to on-chain data. These web applications function as search engines for blockchain networks, allowing users to query, visualize, and analyze blockchain data in real time through intuitive interfaces."} @@ -465,15 +465,15 @@ {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 199, "end_char": 721, "estimated_token_count": 78, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nPrecompiles offer Polkadot Hub developers access to high-performance native functions directly from their smart contracts. Each precompile has a specific address and accepts a particular input data format. When called correctly, they execute optimized, native implementations of commonly used functions much more efficiently than equivalent contract-based implementations.\n\nThis guide demonstrates how to interact with each standard precompile available in Polkadot Hub through Solidity smart contracts."} {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 1, "depth": 2, "title": "Basic Precompile Interaction Pattern", "anchor": "basic-precompile-interaction-pattern", "start_char": 721, "end_char": 1661, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Basic Precompile Interaction Pattern\n\nAll precompiles follow a similar interaction pattern:\n\n```solidity\n// Generic pattern for calling precompiles\nfunction callPrecompile(address precompileAddress, bytes memory input)\n internal\n returns (bool success, bytes memory result)\n{\n // Direct low-level call to the precompile address\n (success, result) = precompileAddress.call(input);\n\n // Ensure the call was successful\n require(success, \"Precompile call failed\");\n\n return (success, result);\n}\n```\n\nFeel free to check the [`precompiles-hardhat`](https://github.com/polkadot-developers/polkavm-hardhat-examples/tree/v0.0.3/precompiles-hardhat){target=\\_blank} repository to check all the precompiles examples. 
The repository contains a set of example contracts and test files demonstrating how to interact with each precompile in Polkadot Hub.\n\nNow, you'll explore how to use each precompile available in Polkadot Hub."} {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 2, "depth": 2, "title": "ECRecover (0x01)", "anchor": "ecrecover-0x01", "start_char": 1661, "end_char": 3161, "estimated_token_count": 325, "token_estimator": "heuristic-v1", "text": "## ECRecover (0x01)\n\nECRecover recovers an Ethereum address associated with the public key used to sign a message.\n\n```solidity title=\"ECRecover.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ECRecoverExample {\n event ECRecovered(bytes result);\n\n // Address of the ECRecover precompile\n address constant EC_RECOVER_ADDRESS = address(0x01);\n bytes public result;\n\n function callECRecover(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = EC_RECOVER_ADDRESS.call{value: 0}(input);\n\n if (success) {\n emit ECRecovered(resultInMemory);\n }\n\n result = resultInMemory;\n }\n\n function getRecoveredAddress() public view returns (address) {\n require(result.length == 32, \"Invalid result length\");\n return address(uint160(uint256(bytes32(result))));\n }\n}\n```\n\nTo interact with the ECRecover precompile, you can deploy the `ECRecoverExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment. The `callECRecover` function takes a 128-byte input combining the message `hash`, `v`, `r`, and `s` signature values. Check this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ECRecover.js){target=\\_blank} that shows how to format this input and verify that the recovered address matches the expected result."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 3, "depth": 2, "title": "SHA-256 (0x02)", "anchor": "sha-256-0x02", "start_char": 3161, "end_char": 4399, "estimated_token_count": 294, "token_estimator": "heuristic-v1", "text": "## SHA-256 (0x02)\n\nThe SHA-256 precompile computes the SHA-256 hash of the input data.\n\n```solidity title=\"SHA256.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract SHA256Example {\n event SHA256Called(bytes result);\n\n // Address of the SHA256 precompile\n address constant SHA256_PRECOMPILE = address(0x02);\n\n bytes public result;\n\n function callH256(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input);\n\n if (success) {\n emit SHA256Called(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\\_blank} module."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 4, "depth": 2, "title": "RIPEMD-160 (0x03)", "anchor": "ripemd-160-0x03", "start_char": 4399, "end_char": 5788, "estimated_token_count": 299, "token_estimator": "heuristic-v1", "text": "## RIPEMD-160 (0x03)\n\nThe RIPEMD-160 precompile computes the RIPEMD-160 hash of the input data.\n\n```solidity title=\"RIPEMD160.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract RIPEMD160Example {\n // RIPEMD-160 precompile address\n address constant RIPEMD160_PRECOMPILE = address(0x03);\n\n bytes32 public result;\n\n event RIPEMD160Called(bytes32 result);\n\n function calculateRIPEMD160(bytes calldata input) public returns (bytes32) {\n (bool success, bytes memory returnData) = RIPEMD160_PRECOMPILE.call(\n input\n );\n require(success, \"RIPEMD-160 precompile call failed\");\n // return full 32 bytes, no assembly extraction\n bytes32 fullHash;\n assembly {\n fullHash := mload(add(returnData, 32))\n }\n result = fullHash;\n emit RIPEMD160Called(fullHash);\n return fullHash;\n }\n}\n```\n\nTo use it, you can deploy the `RIPEMD160Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `calculateRIPEMD160` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/RIPEMD160.js){target=\\_blank} shows how to hash a UTF-8 string, pad the 20-byte result to 32 bytes, and verify it against the expected output."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 5, "depth": 2, "title": "Identity (Data Copy) (0x04)", "anchor": "identity-data-copy-0x04", "start_char": 5788, "end_char": 7024, "estimated_token_count": 259, "token_estimator": "heuristic-v1", "text": "## Identity (Data Copy) (0x04)\n\nThe Identity precompile simply returns the input data as output. While seemingly trivial, it can be useful for testing and certain specialized scenarios.\n\n```solidity title=\"Identity.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract IdentityExample {\n event IdentityCalled(bytes result);\n\n // Address of the Identity precompile\n address constant IDENTITY_PRECOMPILE = address(0x04);\n\n bytes public result;\n\n function callIdentity(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = IDENTITY_PRECOMPILE.call(input);\n\n if (success) {\n emit IdentityCalled(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `IdentityExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callIdentity` with arbitrary bytes. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Identity.js){target=\\_blank} shows how to pass input data and verify that the precompile returns it unchanged."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 6, "depth": 2, "title": "Modular Exponentiation (0x05)", "anchor": "modular-exponentiation-0x05", "start_char": 7024, "end_char": 8506, "estimated_token_count": 309, "token_estimator": "heuristic-v1", "text": "## Modular Exponentiation (0x05)\n\nThe ModExp precompile performs modular exponentiation, which is an operation commonly needed in cryptographic algorithms.\n\n```solidity title=\"ModExp.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ModExpExample {\n address constant MODEXP_ADDRESS = address(0x05);\n\n function modularExponentiation(\n bytes memory base,\n bytes memory exponent,\n bytes memory modulus\n ) public view returns (bytes memory) {\n bytes memory input = abi.encodePacked(\n toBytes32(base.length),\n toBytes32(exponent.length),\n toBytes32(modulus.length),\n base,\n exponent,\n modulus\n );\n\n (bool success, bytes memory result) = MODEXP_ADDRESS.staticcall(input);\n require(success, \"ModExp precompile call failed\");\n\n return result;\n }\n\n function toBytes32(uint256 value) internal pure returns (bytes32) {\n return bytes32(value);\n }\n}\n```\n\nTo use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `modularExponentiation` with encoded `base`, `exponent`, and `modulus` bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ModExp.js){target=\\_blank} shows how to test modular exponentiation like (4 ** 13) % 497 = 445."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 7, "depth": 2, "title": "BN128 Addition (0x06)", "anchor": "bn128-addition-0x06", "start_char": 8506, "end_char": 10020, "estimated_token_count": 343, "token_estimator": "heuristic-v1", "text": "## BN128 Addition (0x06)\n\nThe BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations.\n\n```solidity title=\"BN128Add.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.20;\n\ncontract BN128AddExample {\n address constant BN128_ADD_PRECOMPILE = address(0x06);\n\n event BN128Added(uint256 x3, uint256 y3);\n\n uint256 public resultX;\n uint256 public resultY;\n\n function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public {\n bytes memory input = abi.encodePacked(\n bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2)\n );\n\n bool success;\n bytes memory output;\n\n (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input);\n\n require(success, \"BN128Add precompile call failed\");\n require(output.length == 64, \"Invalid output length\");\n\n (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256));\n\n resultX = x3;\n resultY = y3;\n\n emit BN128Added(x3, y3);\n }\n}\n```\n\nTo use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\\_blank} demonstrates a valid curve addition and checks the result against known expected values."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 8, "depth": 2, "title": "BN128 Scalar Multiplication (0x07)", "anchor": "bn128-scalar-multiplication-0x07", "start_char": 10020, "end_char": 11756, "estimated_token_count": 369, "token_estimator": "heuristic-v1", "text": "## BN128 Scalar Multiplication (0x07)\n\nThe BN128Mul precompile performs scalar multiplication on the alt_bn128 curve.\n\n```solidity title=\"BN128Mul.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract BN128MulExample {\n // Precompile address for BN128Mul\n address constant BN128_MUL_ADDRESS = address(0x07);\n\n bytes public result;\n\n // Performs scalar multiplication of a point on the alt_bn128 curve\n function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public {\n // Format: [x, y, scalar] - each 32 bytes\n bytes memory input = abi.encodePacked(\n bytes32(x1),\n bytes32(y1),\n bytes32(scalar)\n );\n\n (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{\n value: 0\n }(input);\n require(success, \"BN128Mul precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper to decode result from `result` storage\n function getResult() public view returns (uint256 x2, uint256 y2) {\n bytes memory tempResult = result;\n require(tempResult.length >= 64, \"Invalid result length\");\n assembly {\n x2 := mload(add(tempResult, 32))\n y2 := mload(add(tempResult, 64))\n }\n }\n}\n```\n\nTo use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 9, "depth": 2, "title": "BN128 Pairing Check (0x08)", "anchor": "bn128-pairing-check-0x08", "start_char": 11756, "end_char": 13264, "estimated_token_count": 313, "token_estimator": "heuristic-v1", "text": "## BN128 Pairing Check (0x08)\n\nThe BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification.\n\n```solidity title=\"BN128Pairing.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract BN128PairingExample {\n // Precompile address for BN128Pairing\n address constant BN128_PAIRING_ADDRESS = address(0x08);\n\n bytes public result;\n\n // Performs a pairing check on the alt_bn128 curve\n function bn128Pairing(bytes memory input) public {\n // Call the precompile\n (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS\n .call{value: 0}(input);\n require(success, \"BN128Pairing precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper function to decode the result from `result` storage\n function getResult() public view returns (bool isValid) {\n bytes memory tempResult = result;\n require(tempResult.length == 32, \"Invalid result length\");\n\n uint256 output;\n assembly {\n output := mload(add(tempResult, 32))\n }\n\n isValid = (output == 1);\n }\n}\n```\n\nYou can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or your preferred environment. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\\_blank} contains these tests with working examples."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 10, "depth": 2, "title": "Blake2F (0x09)", "anchor": "blake2f-0x09", "start_char": 13264, "end_char": 17391, "estimated_token_count": 945, "token_estimator": "heuristic-v1", "text": "## Blake2F (0x09)\n\nThe Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.\n\n```solidity title=\"Blake2F.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract Blake2FExample {\n // Precompile address for Blake2F\n address constant BLAKE2F_ADDRESS = address(0x09);\n\n bytes public result;\n\n function blake2F(bytes memory input) public {\n // Input must be exactly 213 bytes\n require(input.length == 213, \"Invalid input length - must be 213 bytes\");\n\n // Call the precompile\n (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{\n value: 0\n }(input);\n require(success, \"Blake2F precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper function to decode the result from `result` storage\n function getResult() public view returns (bytes32[8] memory output) {\n bytes memory tempResult = result;\n require(tempResult.length == 64, \"Invalid result length\");\n\n for (uint i = 0; i < 8; i++) {\n assembly {\n mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i))))\n }\n }\n }\n\n\n // Helper function to create Blake2F input from parameters\n function createBlake2FInput(\n uint32 rounds,\n bytes32[8] memory h,\n bytes32[16] memory m,\n bytes8[2] memory t,\n bool f\n ) public pure returns (bytes memory) {\n // Start with rounds (4 bytes, big-endian)\n bytes memory input = abi.encodePacked(rounds);\n\n // Add state vector h (8 * 32 = 256 bytes)\n for (uint i = 0; i < 8; i++) {\n input = abi.encodePacked(input, h[i]);\n }\n\n // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes)\n // Blake2F expects 64-bit words in little-endian format\n for (uint i = 0; i < 16; i++) {\n // Take only the first 8 bytes of each bytes32 and reverse for little-endian\n bytes8 word = bytes8(m[i]);\n input = abi.encodePacked(input, word);\n }\n\n // Add offset counters t (2 * 8 = 16 bytes)\n input = abi.encodePacked(input, t[0], t[1]);\n\n // Add final block flag (1 byte)\n input = abi.encodePacked(input, f ? 
bytes1(0x01) : bytes1(0x00));\n\n return input;\n }\n\n // Simplified function that works with raw hex input\n function blake2FFromHex(string memory hexInput) public {\n bytes memory input = hexStringToBytes(hexInput);\n blake2F(input);\n }\n\n // Helper function to convert hex string to bytes\n function hexStringToBytes(string memory hexString) public pure returns (bytes memory) {\n bytes memory hexBytes = bytes(hexString);\n require(hexBytes.length % 2 == 0, \"Invalid hex string length\");\n \n bytes memory result = new bytes(hexBytes.length / 2);\n \n for (uint i = 0; i < hexBytes.length / 2; i++) {\n result[i] = bytes1(\n (hexCharToByte(hexBytes[2 * i]) << 4) | \n hexCharToByte(hexBytes[2 * i + 1])\n );\n }\n \n return result;\n }\n\n function hexCharToByte(bytes1 char) internal pure returns (uint8) {\n uint8 c = uint8(char);\n if (c >= 48 && c <= 57) return c - 48; // 0-9\n if (c >= 65 && c <= 70) return c - 55; // A-F\n if (c >= 97 && c <= 102) return c - 87; // a-f\n revert(\"Invalid hex character\");\n }\n}\n```\n\nTo use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 11, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 17391, "end_char": 18009, "estimated_token_count": 92, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nPrecompiles in Polkadot Hub provide efficient, native implementations of cryptographic functions and other commonly used operations. By understanding how to interact with these precompiles from your Solidity contracts, you can build more efficient and feature-rich applications on the Polkadot ecosystem.\n\nThe examples provided in this guide demonstrate the basic patterns for interacting with each precompile. Developers can adapt these patterns to their specific use cases, leveraging the performance benefits of native implementations while maintaining the flexibility of smart contract development."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 3, "depth": 2, "title": "SHA-256 (0x02)", "anchor": "sha-256-0x02", "start_char": 3161, "end_char": 3843, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## SHA-256 (0x02)\n\nThe SHA-256 precompile computes the SHA-256 hash of the input data.\n\n```solidity title=\"SHA256.sol\"\n\n```\n\nTo use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\\_blank} module."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 4, "depth": 2, "title": "RIPEMD-160 (0x03)", "anchor": "ripemd-160-0x03", "start_char": 3843, "end_char": 5232, "estimated_token_count": 299, "token_estimator": "heuristic-v1", "text": "## RIPEMD-160 (0x03)\n\nThe RIPEMD-160 precompile computes the RIPEMD-160 hash of the input data.\n\n```solidity title=\"RIPEMD160.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract RIPEMD160Example {\n // RIPEMD-160 precompile address\n address constant RIPEMD160_PRECOMPILE = address(0x03);\n\n bytes32 public result;\n\n event RIPEMD160Called(bytes32 result);\n\n function calculateRIPEMD160(bytes calldata input) public returns (bytes32) {\n (bool success, bytes memory returnData) = RIPEMD160_PRECOMPILE.call(\n input\n );\n require(success, \"RIPEMD-160 precompile call failed\");\n // return full 32 bytes, no assembly extraction\n bytes32 fullHash;\n assembly {\n fullHash := mload(add(returnData, 32))\n }\n result = fullHash;\n emit RIPEMD160Called(fullHash);\n return fullHash;\n }\n}\n```\n\nTo use it, you can deploy the `RIPEMD160Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `calculateRIPEMD160` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/RIPEMD160.js){target=\\_blank} shows how to hash a UTF-8 string, pad the 20-byte result to 32 bytes, and verify it against the expected output."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 5, "depth": 2, "title": "Identity (Data Copy) (0x04)", "anchor": "identity-data-copy-0x04", "start_char": 5232, "end_char": 6468, "estimated_token_count": 259, "token_estimator": "heuristic-v1", "text": "## Identity (Data Copy) (0x04)\n\nThe Identity precompile simply returns the input data as output. While seemingly trivial, it can be useful for testing and certain specialized scenarios.\n\n```solidity title=\"Identity.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract IdentityExample {\n event IdentityCalled(bytes result);\n\n // Address of the Identity precompile\n address constant IDENTITY_PRECOMPILE = address(0x04);\n\n bytes public result;\n\n function callIdentity(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = IDENTITY_PRECOMPILE.call(input);\n\n if (success) {\n emit IdentityCalled(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `IdentityExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callIdentity` with arbitrary bytes. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Identity.js){target=\\_blank} shows how to pass input data and verify that the precompile returns it unchanged."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 6, "depth": 2, "title": "Modular Exponentiation (0x05)", "anchor": "modular-exponentiation-0x05", "start_char": 6468, "end_char": 7950, "estimated_token_count": 309, "token_estimator": "heuristic-v1", "text": "## Modular Exponentiation (0x05)\n\nThe ModExp precompile performs modular exponentiation, which is an operation commonly needed in cryptographic algorithms.\n\n```solidity title=\"ModExp.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ModExpExample {\n address constant MODEXP_ADDRESS = address(0x05);\n\n function modularExponentiation(\n bytes memory base,\n bytes memory exponent,\n bytes memory modulus\n ) public view returns (bytes memory) {\n bytes memory input = abi.encodePacked(\n toBytes32(base.length),\n toBytes32(exponent.length),\n toBytes32(modulus.length),\n base,\n exponent,\n modulus\n );\n\n (bool success, bytes memory result) = MODEXP_ADDRESS.staticcall(input);\n require(success, \"ModExp precompile call failed\");\n\n return result;\n }\n\n function toBytes32(uint256 value) internal pure returns (bytes32) {\n return bytes32(value);\n }\n}\n```\n\nTo use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `modularExponentiation` with encoded `base`, `exponent`, and `modulus` bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ModExp.js){target=\\_blank} shows how to test modular exponentiation like (4 ** 13) % 497 = 445."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 7, "depth": 2, "title": "BN128 Addition (0x06)", "anchor": "bn128-addition-0x06", "start_char": 7950, "end_char": 8599, "estimated_token_count": 157, "token_estimator": "heuristic-v1", "text": "## BN128 Addition (0x06)\n\nThe BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations.\n\n```solidity title=\"BN128Add.sol\"\n\n```\n\nTo use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\\_blank} demonstrates a valid curve addition and checks the result against known expected values."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 8, "depth": 2, "title": "BN128 Scalar Multiplication (0x07)", "anchor": "bn128-scalar-multiplication-0x07", "start_char": 8599, "end_char": 9214, "estimated_token_count": 149, "token_estimator": "heuristic-v1", "text": "## BN128 Scalar Multiplication (0x07)\n\nThe BN128Mul precompile performs scalar multiplication on the alt_bn128 curve.\n\n```solidity title=\"BN128Mul.sol\"\n\n```\n\nTo use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 9, "depth": 2, "title": "BN128 Pairing Check (0x08)", "anchor": "bn128-pairing-check-0x08", "start_char": 9214, "end_char": 9764, "estimated_token_count": 135, "token_estimator": "heuristic-v1", "text": "## BN128 Pairing Check (0x08)\n\nThe BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification.\n\n```solidity title=\"BN128Pairing.sol\"\n\n```\n\nYou can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\\_blank} contains these tests with working examples."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 10, "depth": 2, "title": "Blake2F (0x09)", "anchor": "blake2f-0x09", "start_char": 9764, "end_char": 10518, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Blake2F (0x09)\n\nThe Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.\n\n```solidity title=\"Blake2F.sol\"\n\n```\n\nTo use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 11, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 10518, "end_char": 11136, "estimated_token_count": 92, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nPrecompiles in Polkadot Hub provide efficient, native implementations of cryptographic functions and other commonly used operations. By understanding how to interact with these precompiles from your Solidity contracts, you can build more efficient and feature-rich applications on the Polkadot ecosystem.\n\nThe examples provided in this guide demonstrate the basic patterns for interacting with each precompile. Developers can adapt these patterns to their specific use cases, leveraging the performance benefits of native implementations while maintaining the flexibility of smart contract development."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 18, "end_char": 913, "estimated_token_count": 191, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nThe [XCM (Cross-Consensus Message)](/develop/interoperability/intro-to-xcm){target=\\_blank} precompile enables Polkadot Hub developers to access XCM functionality directly from their smart contracts using a Solidity interface.\n\nLocated at the fixed address `0x00000000000000000000000000000000000a0000`, the XCM precompile offers three primary functions:\n\n- **`execute`**: For local XCM execution.\n- **`send`**: For cross-chain message transmission.\n- **`weighMessage`**: For cost estimation.\n\nThis guide demonstrates how to interact with the XCM precompile through Solidity smart contracts using [Remix IDE](/develop/smart-contracts/dev-environments/remix){target=\\_blank}.\n\n!!!note\n The XCM precompile provides the barebones XCM functionality. While it provides a lot of flexibility, it doesn't provide abstractions to hide away XCM details. These have to be built on top."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 1, "depth": 2, "title": "Precompile Interface", "anchor": "precompile-interface", "start_char": 913, "end_char": 4064, "estimated_token_count": 708, "token_estimator": "heuristic-v1", "text": "## Precompile Interface\n\nThe XCM precompile implements the `IXcm` interface, which defines the structure for interacting with XCM functionality. 
The source code for the interface is as follows:\n\n```solidity title=\"IXcm.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.20;\n\n/// @dev The on-chain address of the XCM (Cross-Consensus Messaging) precompile.\naddress constant XCM_PRECOMPILE_ADDRESS = address(0xA0000);\n\n/// @title XCM Precompile Interface\n/// @notice A low-level interface for interacting with `pallet_xcm`.\n/// It forwards calls directly to the corresponding dispatchable functions,\n/// providing access to XCM execution and message passing.\n/// @dev Documentation:\n/// @dev - XCM: https://docs.polkadot.com/develop/interoperability\n/// @dev - SCALE codec: https://docs.polkadot.com/polkadot-protocol/parachain-basics/data-encoding\n/// @dev - Weights: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/fees/#transactions-weights-and-fees\ninterface IXcm {\n /// @notice Weight v2 used for measurement for an XCM execution\n struct Weight {\n /// @custom:property The computational time used to execute some logic based on reference hardware.\n uint64 refTime;\n /// @custom:property The size of the proof needed to execute some logic.\n uint64 proofSize;\n }\n\n /// @notice Executes an XCM message locally on the current chain with the caller's origin.\n /// @dev Internally calls `pallet_xcm::execute`.\n /// @param message A SCALE-encoded Versioned XCM message.\n /// @param weight The maximum allowed `Weight` for execution.\n /// @dev Call @custom:function weighMessage(message) to ensure sufficient weight allocation.\n function execute(bytes calldata message, Weight calldata weight) external;\n\n /// @notice Sends an XCM message to another parachain or consensus system.\n /// @dev Internally calls `pallet_xcm::send`.\n /// @param destination SCALE-encoded destination MultiLocation.\n /// @param message SCALE-encoded Versioned XCM message.\n function send(bytes calldata destination, bytes calldata message) external;\n\n /// @notice Estimates the `Weight` required to execute a given XCM message.\n /// @param message SCALE-encoded Versioned XCM message to analyze.\n /// @return weight Struct containing estimated `refTime` and `proofSize`.\n function weighMessage(bytes calldata message) external view returns (Weight memory weight);\n}\n```\n\nThe interface defines a `Weight` struct that represents the computational cost of XCM operations. Weight has two components: \n\n- **`refTime`**: Computational time on reference hardware.\n- **`proofSize`**: The size of the proof required for execution.\n\nAll XCM messages must be encoded using the [SCALE codec](/polkadot-protocol/parachain-basics/data-encoding/#data-encoding){target=\\_blank}, Polkadot's standard serialization format.\n\nFor further information, check the [`precompiles/IXCM.sol`](https://github.com/paritytech/polkadot-sdk/blob/cb629d46ebf00aa65624013a61f9c69ebf02b0b4/polkadot/xcm/pallet-xcm/src/precompiles/IXcm.sol){target=\\_blank} file present in `pallet-xcm`."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 2, "depth": 2, "title": "Interact with the XCM Precompile", "anchor": "interact-with-the-xcm-precompile", "start_char": 4064, "end_char": 5303, "estimated_token_count": 306, "token_estimator": "heuristic-v1", "text": "## Interact with the XCM Precompile\n\nTo interact with the XCM precompile, you can use the precompile interface directly in [Remix IDE](/develop/smart-contracts/dev-environments/remix/){target=\\_blank}:\n\n1. 
Create a new file called `IXcm.sol` in Remix.\n2. Copy and paste the `IXcm` interface code into the file.\n3. Compile the interface by selecting the button or using **Ctrl +S** keys:\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-01.webp)\n\n4. In the **Deploy & Run Transactions** tab, select the `IXcm` interface from the contract dropdown.\n5. Enter the precompile address `0x00000000000000000000000000000000000a0000` in the **At Address** input field.\n6. Select the **At Address** button to connect to the precompile.\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-02.webp)\n\n7. Once connected, you can use the Remix interface to interact with the XCM precompile's `execute`, `send`, and `weighMessage` functions.\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-03.webp)\n\nThe main entrypoint of the precompile is the `execute` function. However, it's necessary to first call `weighMessage` to fill in the required parameters."} @@ -553,7 +553,7 @@ {"page_id": "develop-toolkit-api-libraries-subxt", "page_title": "Subxt Rust API", "index": 8, "depth": 3, "title": "Submit Transactions", "anchor": "submit-transactions", "start_char": 7470, "end_char": 8944, "estimated_token_count": 311, "token_estimator": "heuristic-v1", "text": "### Submit Transactions\n\nTo submit a transaction, you must construct an extrinsic, sign it with your private key, and send it to the blockchain. Replace `INSERT_DEST_ADDRESS` with the recipient's address, `INSERT_AMOUNT` with the amount to transfer, and `INSERT_SECRET_PHRASE` with the sender's mnemonic phrase:\n\n```rust\n // Define the recipient address and transfer amount.\n const DEST_ADDRESS: &str = \"INSERT_DEST_ADDRESS\";\n const AMOUNT: u128 = INSERT_AMOUNT;\n\n // Convert the recipient address into an `AccountId32`.\n let dest = AccountId32::from_str(DEST_ADDRESS).unwrap();\n\n // Build the balance transfer extrinsic.\n let balance_transfer_tx = polkadot::tx()\n .balances()\n .transfer_allow_death(dest.into(), AMOUNT);\n\n // Load the sender's keypair from a mnemonic phrase.\n const SECRET_PHRASE: &str = \"INSERT_SECRET_PHRASE\";\n let mnemonic = Mnemonic::parse(SECRET_PHRASE).unwrap();\n let sender_keypair = Keypair::from_phrase(&mnemonic, None).unwrap();\n\n // Sign and submit the extrinsic, then wait for it to be finalized.\n let events = api\n .tx()\n .sign_and_submit_then_watch_default(&balance_transfer_tx, &sender_keypair)\n .await?\n .wait_for_finalized_success()\n .await?;\n\n // Check for a successful transfer event.\n if let Some(event) = events.find_first::()? 
{\n println!(\"Balance transfer successful: {:?}\", event);\n }\n```"} {"page_id": "develop-toolkit-api-libraries-subxt", "page_title": "Subxt Rust API", "index": 9, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 8944, "end_char": 9174, "estimated_token_count": 57, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\nNow that you've covered the basics dive into the official [subxt documentation](https://docs.rs/subxt/latest/subxt/book/index.html){target=\\_blank} for comprehensive reference materials and advanced features."} {"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 392, "end_char": 442, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 442, "end_char": 1096, "estimated_token_count": 179, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-api-libraries", "page_title": "API Libraries", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 442, "end_char": 1074, "estimated_token_count": 173, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 0, "depth": 2, "title": "The Challenge of Blockchain Data Access", "anchor": "the-challenge-of-blockchain-data-access", "start_char": 12, "end_char": 649, "estimated_token_count": 103, "token_estimator": "heuristic-v1", "text": "## The Challenge of Blockchain Data Access\n\nBlockchain data is inherently sequential and distributed, with information stored chronologically across numerous blocks. While retrieving data from a single block through JSON-RPC API calls is straightforward, more complex queries that span multiple blocks present significant challenges:\n\n- Data is scattered and unorganized across the blockchain.\n- Retrieving large datasets can take days or weeks to sync.\n- Complex operations (like aggregations, averages, or cross-chain queries) require additional processing.\n- Direct blockchain queries can impact dApp performance and responsiveness."} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 1, "depth": 2, "title": "What is a Blockchain Indexer?", "anchor": "what-is-a-blockchain-indexer", "start_char": 649, "end_char": 1211, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## What is a Blockchain Indexer?\n\nA blockchain indexer is a specialized infrastructure tool that processes, organizes, and stores blockchain data in an optimized format for efficient querying. 
Think of it as a search engine for blockchain data that:\n\n- Continuously monitors the blockchain for new blocks and transactions.\n- Processes and categorizes this data according to predefined schemas.\n- Stores the processed data in an easily queryable database.\n- Provides efficient APIs (typically [GraphQL](https://graphql.org/){target=\\_blank}) for data retrieval."} {"page_id": "develop-toolkit-integrations-indexers", "page_title": "Indexers", "index": 2, "depth": 2, "title": "Indexer Implementations", "anchor": "indexer-implementations", "start_char": 1211, "end_char": 2230, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## Indexer Implementations\n\n
\n\n- __Subsquid__\n\n ---\n\n Subsquid is a data network that allows rapid and cost-efficient retrieval of blockchain data from 100+ chains using Subsquid's decentralized data lake and open-source SDK. In simple terms, Subsquid can be considered an ETL (extract, transform, and load) tool with a GraphQL server included. It enables comprehensive filtering, pagination, and even full-text search capabilities. Subsquid has native and full support for EVM and Substrate data, even within the same project.\n\n [:octicons-arrow-right-24: Reference](https://www.sqd.ai/){target=\\_blank}\n\n- __Subquery__\n\n ---\n\n SubQuery is a fast, flexible, and reliable open-source data decentralised infrastructure network that provides both RPC and indexed data to consumers worldwide.\n It provides custom APIs for your web3 project across multiple supported chains.\n\n [:octicons-arrow-right-24: Reference](https://subquery.network/){target=\\_blank}\n\n
"} @@ -623,10 +623,10 @@ {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks-get-started", "page_title": "Get Started", "index": 9, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 10537, "end_char": 10894, "estimated_token_count": 91, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Fork a Chain with Chopsticks__\n\n ---\n\n Visit this guide for step-by-step instructions for configuring and interacting with your forked chain.\n\n [:octicons-arrow-right-24: Reference](/tutorials/polkadot-sdk/testing/fork-live-chains/)\n\n
"} {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 0, "depth": 2, "title": "What Can I Do with Chopsticks?", "anchor": "what-can-i-do-with-chopsticks", "start_char": 299, "end_char": 685, "estimated_token_count": 71, "token_estimator": "heuristic-v1", "text": "## What Can I Do with Chopsticks?\n\n- Create local forks of live networks.\n- Replay blocks to analyze behavior.\n- Test XCM interactions.\n- Simulate complex scenarios.\n- Modify network storage and state.\n\nWhether you're debugging an issue, testing new features, or exploring cross-chain interactions, Chopsticks provides a safe environment for blockchain experimentation and validation."} {"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 685, "end_char": 735, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 735, "end_char": 1495, "estimated_token_count": 208, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-parachains-fork-chains-chopsticks", "page_title": "Chopsticks", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 735, "end_char": 1473, "estimated_token_count": 202, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 0, "depth": 2, "title": "Why Fork a Live Chain?", "anchor": "why-fork-a-live-chain", "start_char": 512, "end_char": 804, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Why Fork a Live Chain?\n\nForking a live chain creates a controlled environment that mirrors live network conditions. 
This approach enables you to:\n\n- Test features safely before deployment.\n- Debug complex interactions.\n- Validate runtime changes.\n- Experiment with network modifications."} {"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 804, "end_char": 854, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 854, "end_char": 1295, "estimated_token_count": 120, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "develop-toolkit-parachains-fork-chains", "page_title": "Fork Chains for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 854, "end_char": 1284, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 17, "end_char": 994, "estimated_token_count": 167, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nLight clients enable secure and efficient blockchain interaction without running a full node. They provide a trust-minimized alternative to JSON-RPC by verifying data through cryptographic proofs rather than blindly trusting remote nodes.\n\nThis guide covers:\n\n- What light clients are and how they work.\n- Their advantages compared to full nodes and JSON-RPC.\n- Available implementations in the Polkadot ecosystem.\n- How to use light clients in your applications.\n\nLight clients are particularly valuable for resource-constrained environments and applications requiring secure, decentralized blockchain access without the overhead of maintaining full nodes.\n\n!!!note \"Light node or light client?\"\n The terms _light node_ and _light client_ are interchangeable. Both refer to a blockchain client that syncs without downloading the entire blockchain state. All nodes in a blockchain network are fundamentally clients, engaging in peer-to-peer communication."} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 1, "depth": 2, "title": "Light Clients Workflow", "anchor": "light-clients-workflow", "start_char": 994, "end_char": 2625, "estimated_token_count": 359, "token_estimator": "heuristic-v1", "text": "## Light Clients Workflow\n\nUnlike JSON-RPC interfaces, where an application must maintain a list of providers or rely on a single node, light clients are not limited to or dependent on a single node. They use cryptographic proofs to verify the blockchain's state, ensuring it is up-to-date and accurate. By verifying only block headers, light clients avoid syncing the entire state, making them ideal for resource-constrained environments.\n\n```mermaid\nflowchart LR\nDAPP([dApp])-- Query Account Info -->LC([Light Client])\nLC -- Request --> FN(((Full Node)))\nLC -- Response --> DAPP\nFN -- Response (validated via Merkle proof) --> LC\n```\n\nIn the diagram above, the decentralized application queries on-chain account information through the light client. 
The light client runs as part of the application and requires minimal memory and computational resources. It uses Merkle proofs to verify the state retrieved from a full node in a trust-minimized manner. Polkadot-compatible light clients utilize [warp syncing](https://spec.polkadot.network/sect-lightclient#sect-sync-warp-lightclient){target=\\_blank}, which downloads only block headers.\n\nLight clients can quickly verify the blockchain's state, including [GRANDPA finality](/polkadot-protocol/glossary#grandpa){target=\\_blank} justifications.\n\n!!!note \"What does it mean to be trust-minimized?\"\n _Trust-minimized_ means that the light client does not need to fully trust the full node from which it retrieves the state. This is achieved through the use of Merkle proofs, which allow the light client to verify the correctness of the state by checking the Merkle tree root."} {"page_id": "develop-toolkit-parachains-light-clients", "page_title": "Light Clients", "index": 2, "depth": 2, "title": "JSON-RPC and Light Client Comparison", "anchor": "json-rpc-and-light-client-comparison", "start_char": 2625, "end_char": 4478, "estimated_token_count": 442, "token_estimator": "heuristic-v1", "text": "## JSON-RPC and Light Client Comparison\n\nAnother common method of communication between a user interface (UI) and a node is through the JSON-RPC protocol. Generally, the UI retrieves information from the node, fetches network or [pallet](/polkadot-protocol/glossary#pallet){target=\\_blank} data, and interacts with the blockchain. This is typically done in one of two ways:\n\n- **User-controlled nodes**: The UI connects to a node client installed on the user's machine.\n - These nodes are secure, but installation and maintenance can be inconvenient.\n- **Publicly accessible nodes**: The UI connects to a third-party-owned publicly accessible node client.\n - These nodes are convenient but centralized and less secure. Applications must maintain a list of backup nodes in case the primary node becomes unavailable.\n\nWhile light clients still communicate with [full nodes](/polkadot-protocol/glossary#full-node), they offer significant advantages for applications requiring a secure alternative to running a full node:\n\n| Full Node | Light Client |\n| :---------------------------------------------------------------------------------------------: | :------------------------------------------------------------: |\n| Fully verifies all blocks of the chain | Verifies only the authenticity of blocks |\n| Stores previous block data and the chain's storage in a database | Does not require a database |\n| Installation, maintenance, and execution are resource-intensive and require technical expertise | No installation is typically included as part of the application |"} @@ -694,10 +694,10 @@ {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet-write-tests", "page_title": "Write Tests", "index": 7, "depth": 2, "title": "Example Test Files", "anchor": "example-test-files", "start_char": 9313, "end_char": 11297, "estimated_token_count": 504, "token_estimator": "heuristic-v1", "text": "## Example Test Files\n\nThe following example test files define two tests, a small network test and a big network test. Each test defines a network configuration file and credentials to use.\n\nThe tests define assertions to evaluate the network’s metrics and logs. 
The assertions are defined by sentences in the DSL, which are mapped to tests to run.\n\n```toml title=\"small-network-test.zndsl\"\nDescription = \"Small Network test\"\nNetwork = \"./0000-test-config-small-network.toml\"\nCreds = \"config\"\n\n# Metrics\n[[metrics]]\nnode_roles = 4\nsub_libp2p_is_major_syncing = 0\n\n# Logs\n[[logs]]\nbob_log_line_glob = \"*rted #1*\"\nbob_log_line_regex = \"Imported #[0-9]+\"\n\n```\n\nAnd the second test file:\n\n```toml title=\"big-network-test.zndsl\"\nDescription = \"Big Network test\"\nNetwork = \"./0001-test-config-big-network.toml\"\nCreds = \"config\"\n\n# Metrics\n[[metrics]]\nnode_roles = 4\nsub_libp2p_is_major_syncing = 0\n\n# Logs\n[[logs]]\nbob_log_line_glob = \"*rted #1*\"\nbob_log_line_regex = \"Imported #[0-9]+\"\n\n# Custom JS script\n[[custom_scripts]]\nalice_js_script = { path = \"./0008-custom.js\", condition = \"return is greater than 1\", timeout = 200 }\n\n# Custom TS script\n[[custom_scripts]]\nalice_ts_script = { path = \"./0008-custom-ts.ts\", condition = \"return is greater than 1\", timeout = 200 }\n\n# Backchannel\n[[backchannel]]\nalice_wait_for_name = { use_as = \"X\", timeout = 30 }\n\n# Well-known functions\n[[functions]]\nalice_is_up = true\nalice_parachain_100_registered = { condition = \"within\", timeout = 225 }\nalice_parachain_100_block_height = { condition = \"at least 10\", timeout = 250 }\n\n# Histogram\n[[histogram]]\nalice_polkadot_pvf_execution_time = { min_samples = 2, buckets = [\n \"0.1\",\n \"0.25\",\n \"0.5\",\n \"+Inf\",\n], timeout = 100 }\n\n# System events\n[[system_events]]\nalice_system_event_matches = { pattern = \"\\\"paraId\\\":[0-9]+\", timeout = 10 }\n\n# Tracing\n[[tracing]]\nalice_trace = { traceID = \"94c1501a78a0d83c498cc92deec264d9\", contains = [\n \"answer-chunk-request\",\n \"answer-chunk-request\",\n] }\n\n```"} {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 0, "depth": 2, "title": "What Can I Do with Zombienet?", "anchor": "what-can-i-do-with-zombienet", "start_char": 350, "end_char": 729, "estimated_token_count": 66, "token_estimator": "heuristic-v1", "text": "## What Can I Do with Zombienet?\n\n- Deploy test networks with multiple nodes.\n- Validate network behavior and performance.\n- Monitor metrics and system events.\n- Execute custom test scenarios.\n\nWhether you're building a new parachain or testing runtime upgrades, Zombienet provides the tools needed to ensure your blockchain functions correctly before deployment to production."} {"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 729, "end_char": 779, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 779, "end_char": 1237, "estimated_token_count": 115, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} +{"page_id": "develop-toolkit-parachains-spawn-chains-zombienet", "page_title": "Zombienet", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 779, "end_char": 1226, "estimated_token_count": 112, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} {"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 0, "depth": 2, "title": "Why Spawn a Network?", "anchor": "why-spawn-a-network", "start_char": 448, "end_char": 727, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Why Spawn a Network?\n\nSpawning a network provides a controlled environment to test and validate various aspects of your blockchain. Use these tools to:\n\n- Validate network configurations.\n- Test cross-chain messaging.\n- Verify runtime upgrades.\n- Debug complex interactions."} {"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 727, "end_char": 777, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 777, "end_char": 1199, "estimated_token_count": 108, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} +{"page_id": "develop-toolkit-parachains-spawn-chains", "page_title": "Spawn Networks for Testing", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 777, "end_char": 1188, "estimated_token_count": 105, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n
\n "} {"page_id": "develop-toolkit-parachains", "page_title": "Parachains", "index": 0, "depth": 2, "title": "Quick Links", "anchor": "quick-links", "start_char": 600, "end_char": 1005, "estimated_token_count": 110, "token_estimator": "heuristic-v1", "text": "## Quick Links\n\n- [Use Pop CLI to start your parachain project](/develop/toolkit/parachains/quickstart/pop-cli/)\n- [Use Zombienet to spawn a chain](/develop/toolkit/parachains/spawn-chains/zombienet/get-started/)\n- [Use Chopsticks to fork a chain](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/)\n- [Use Moonwall to execute E2E testing](/develop/toolkit/parachains/e2e-testing/moonwall/)"} {"page_id": "develop-toolkit-parachains", "page_title": "Parachains", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1005, "end_char": 1054, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-toolkit", "page_title": "Toolkit", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 853, "end_char": 902, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} @@ -786,15 +786,15 @@ {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 8, "depth": 3, "title": "Activate using Polkadot.js Apps", "anchor": "activate-using-polkadotjs-apps", "start_char": 9803, "end_char": 11084, "estimated_token_count": 345, "token_estimator": "heuristic-v1", "text": "### Activate using Polkadot.js Apps\n\nFollow these steps to use Polkadot.js Apps to activate your validator:\n\n1. In Polkadot.js Apps, navigate to **Network** and select **Staking**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-01.webp)\n\n2. Open the **Accounts** tab and click on **+ Validator**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-02.webp)\n\n3. Set a bond amount in the **value bonded** field and then click **next**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-03.webp)\n\n4. **Set session keys**. Paste the output from `author_rotateKeys` (hex-encoded) to link your validator with its session keys. Then click **Bond & Validate** to continue:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-04.webp)\n\nYou can set the **commission** and the **blocked** option via `staking.validate` extrinsic. 
By default, the blocked option is set to FALSE (i.e., the validator accepts nominations):\n\n![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-05.webp)"} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 9, "depth": 3, "title": "Monitor Validation Status and Slots", "anchor": "monitor-validation-status-and-slots", "start_char": 11084, "end_char": 12036, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "### Monitor Validation Status and Slots\n\nOn the [**Staking**](https://polkadot.js.org/apps/#/staking){target=\\_blank} tab in Polkadot.js Apps, you can see your validator's status, the number of available validator slots, and the nodes that have signaled their intent to validate. Your node may initially appear in the waiting queue, especially if the validator slots are full. The following is an example view of the **Staking** tab:\n\n![staking queue](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-06.webp)\n\nThe validator set refreshes each era. If there's an available slot in the next era, your node may be selected to move from the waiting queue to the active validator set, allowing it to start validating blocks. If your validator is not selected, it remains in the waiting queue. Increasing your stake or gaining more nominators may improve your chance of being selected in future eras."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 10, "depth": 2, "title": "Run a Validator Using Systemd", "anchor": "run-a-validator-using-systemd", "start_char": 12036, "end_char": 13060, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Run a Validator Using Systemd\n\nRunning your Polkadot validator as a [systemd](https://en.wikipedia.org/wiki/Systemd){target=\\_blank} service is an effective way to ensure its high uptime and reliability. Using systemd allows your validator to automatically restart after server reboots or unexpected crashes, significantly reducing the risk of slashing due to downtime.\n\nThis following sections will walk you through creating and managing a systemd service for your validator, allowing you to seamlessly monitor and control it as part of your Linux system. 
\n\nEnsure the following requirements are met before proceeding with the systemd setup:\n\n- Confirm your system meets the [requirements](/infrastructure/running-a-validator/requirements/){target=\\_blank} for running a validator.\n- Ensure you meet the [minimum bond requirements](https://wiki.polkadot.com/general/chain-state-values/#minimum-validator-bond){target=\\_blank} for validating.\n- Verify the Polkadot binary is [installed](#install-the-polkadot-binaries)."} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 11, "depth": 3, "title": "Create the Systemd Service File", "anchor": "create-the-systemd-service-file", "start_char": 13060, "end_char": 14799, "estimated_token_count": 338, "token_estimator": "heuristic-v1", "text": "### Create the Systemd Service File\n\nFirst create a new unit file called `polkadot-validator.service` in `/etc/systemd/system/`:\n\n```bash\ntouch /etc/systemd/system/polkadot-validator.service\n```\n\nIn this unit file, you will write the commands that you want to run on server boot/restart:\n\n```systemd title=\"/etc/systemd/system/polkadot-validator.service\"\n[Unit]\nDescription=Polkadot Node\nAfter=network.target\nDocumentation=https://github.com/paritytech/polkadot-sdk\n\n[Service]\nEnvironmentFile=-/etc/default/polkadot\nExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS\nUser=polkadot\nGroup=polkadot\nRestart=always\nRestartSec=120\nCapabilityBoundingSet=\nLockPersonality=true\nNoNewPrivileges=true\nPrivateDevices=true\nPrivateMounts=true\nPrivateTmp=true\nPrivateUsers=true\nProtectClock=true\nProtectControlGroups=true\nProtectHostname=true\nProtectKernelModules=true\nProtectKernelTunables=true\nProtectSystem=strict\nRemoveIPC=true\nRestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX\nRestrictNamespaces=false\nRestrictSUIDSGID=true\nSystemCallArchitectures=native\nSystemCallFilter=@system-service\nSystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2\nSystemCallFilter=~@clock @module @reboot @swap @privileged\nSystemCallFilter=pivot_root\nUMask=0027\n\n[Install]\nWantedBy=multi-user.target\n```\n\n!!! warning \"Restart delay and equivocation risk\"\n It is recommended that a node's restart be delayed with `RestartSec` in the case of a crash. It's possible that when a node crashes, consensus votes in GRANDPA aren't persisted to disk. In this case, there is potential to equivocate when immediately restarting. 
Delaying the restart will allow the network to progress past potentially conflicting votes."} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 12, "depth": 3, "title": "Run the Service", "anchor": "run-the-service", "start_char": 14799, "end_char": 15774, "estimated_token_count": 243, "token_estimator": "heuristic-v1", "text": "### Run the Service\n\nActivate the systemd service to start on system boot by running:\n\n```bash\nsystemctl enable polkadot-validator.service\n```\n\nTo start the service manually, use:\n\n```bash\nsystemctl start polkadot-validator.service\n```\n\nCheck the service's status to confirm it is running:\n\n```bash\nsystemctl status polkadot-validator.service\n```\n\nTo view the logs in real-time, use [journalctl](https://www.freedesktop.org/software/systemd/man/latest/journalctl.html){target=\\_blank} like so:\n\n```bash\njournalctl -f -u polkadot-validator\n```\n\nWith these steps, you can effectively manage and monitor your validator as a systemd service.\n\nOnce your validator is active, it's officially part of Polkadot's security infrastructure. For questions or further support, you can reach out to the [Polkadot Validator chat](https://matrix.to/#/!NZrbtteFeqYKCUGQtr:matrix.parity.io?via=matrix.parity.io&via=matrix.org&via=web3.foundation){target=\\_blank} for tips and troubleshooting."} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 11, "depth": 3, "title": "Create the Systemd Service File", "anchor": "create-the-systemd-service-file", "start_char": 13060, "end_char": 13830, "estimated_token_count": 178, "token_estimator": "heuristic-v1", "text": "### Create the Systemd Service File\n\nFirst create a new unit file called `polkadot-validator.service` in `/etc/systemd/system/`:\n\n```bash\ntouch /etc/systemd/system/polkadot-validator.service\n```\n\nIn this unit file, you will write the commands that you want to run on server boot/restart:\n\n```systemd title=\"/etc/systemd/system/polkadot-validator.service\"\n\n```\n\n!!! warning \"Restart delay and equivocation risk\"\n It is recommended that a node's restart be delayed with `RestartSec` in the case of a crash. It's possible that when a node crashes, consensus votes in GRANDPA aren't persisted to disk. In this case, there is potential to equivocate when immediately restarting. 
Delaying the restart will allow the network to progress past potentially conflicting votes."} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 12, "depth": 3, "title": "Run the Service", "anchor": "run-the-service", "start_char": 13830, "end_char": 14805, "estimated_token_count": 243, "token_estimator": "heuristic-v1", "text": "### Run the Service\n\nActivate the systemd service to start on system boot by running:\n\n```bash\nsystemctl enable polkadot-validator.service\n```\n\nTo start the service manually, use:\n\n```bash\nsystemctl start polkadot-validator.service\n```\n\nCheck the service's status to confirm it is running:\n\n```bash\nsystemctl status polkadot-validator.service\n```\n\nTo view the logs in real-time, use [journalctl](https://www.freedesktop.org/software/systemd/man/latest/journalctl.html){target=\\_blank} like so:\n\n```bash\njournalctl -f -u polkadot-validator\n```\n\nWith these steps, you can effectively manage and monitor your validator as a systemd service.\n\nOnce your validator is active, it's officially part of Polkadot's security infrastructure. For questions or further support, you can reach out to the [Polkadot Validator chat](https://matrix.to/#/!NZrbtteFeqYKCUGQtr:matrix.parity.io?via=matrix.parity.io&via=matrix.org&via=web3.foundation){target=\\_blank} for tips and troubleshooting."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 19, "end_char": 498, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIf you're ready to stop validating on Polkadot, there are essential steps to ensure a smooth transition while protecting your funds and account integrity. Whether you're taking a break for maintenance or unbonding entirely, you'll need to chill your validator, purge session keys, and unbond your tokens. This guide explains how to use Polkadot's tools and extrinsics to safely withdraw from validation activities, safeguarding your account's future usability."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 1, "depth": 2, "title": "Pause Versus Stop", "anchor": "pause-versus-stop", "start_char": 498, "end_char": 920, "estimated_token_count": 83, "token_estimator": "heuristic-v1", "text": "## Pause Versus Stop\n\nIf you wish to remain a validator or nominator (for example, stopping for planned downtime or server maintenance), submitting the `chill` extrinsic in the `staking` pallet should suffice. Additional steps are only needed to unbond funds or reap an account.\n\nThe following are steps to ensure a smooth stop to validation:\n\n- Chill the validator.\n- Purge validator session keys.\n- Unbond your tokens."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 2, "depth": 2, "title": "Chill Validator", "anchor": "chill-validator", "start_char": 920, "end_char": 1499, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Chill Validator\n\nWhen stepping back from validating, the first step is to chill your validator status. 
This action stops your validator from being considered for the next era without fully unbonding your tokens, which can be useful for temporary pauses like maintenance or planned downtime.\n\nUse the `staking.chill` extrinsic to initiate this. For more guidance on chilling your node, refer to the [Pause Validating](/infrastructure/running-a-validator/operational-tasks/pause-validating/){target=\\_blank} guide. You may also claim any pending staking rewards at this point."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 3, "depth": 2, "title": "Purge Validator Session Keys", "anchor": "purge-validator-session-keys", "start_char": 1499, "end_char": 2530, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Purge Validator Session Keys\n\nPurging validator session keys is a critical step in removing the association between your validator account and its session keys, which ensures that your account is fully disassociated from validator activities. The `session.purgeKeys` extrinsic removes the reference to your session keys from the stash or staking proxy account that originally set them.\n\nHere are a couple of important things to know about purging keys:\n\n- **Account used to purge keys**: Always use the same account to purge keys you originally used to set them, usually your stash or staking proxy account. Using a different account may leave an unremovable reference to the session keys on the original account, preventing its reaping.\n- **Account reaping issue**: Failing to purge keys will prevent you from reaping (fully deleting) your stash account. If you attempt to transfer tokens without purging, you'll need to rebond, purge the session keys, unbond again, and wait through the unbonding period before any transfer."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 4, "depth": 2, "title": "Unbond Your Tokens", "anchor": "unbond-your-tokens", "start_char": 2530, "end_char": 3228, "estimated_token_count": 142, "token_estimator": "heuristic-v1", "text": "## Unbond Your Tokens\n\nAfter chilling your node and purging session keys, the final step is to unbond your staked tokens. This action removes them from staking and begins the unbonding period (usually 28 days for Polkadot and seven days for Kusama), after which the tokens will be transferable.\n\nTo unbond tokens, go to **Network > Staking > Account Actions** on Polkadot.js Apps. Select your stash account, click on the dropdown menu, and choose **Unbond Funds**. 
Alternatively, you can use the `staking.unbond` extrinsic if you handle this via a staking proxy account.\n\nOnce the unbonding period is complete, your tokens will be available for use in transactions or transfers outside of staking."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 381, "end_char": 431, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 431, "end_char": 1975, "estimated_token_count": 404, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding", "page_title": "Onboarding and Offboarding", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 431, "end_char": 1931, "estimated_token_count": 392, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 759, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nValidator performance is pivotal in maintaining the security and stability of the Polkadot network. As a validator, optimizing your setup ensures efficient transaction processing, minimizes latency, and maintains system reliability during high-demand periods. Proper configuration and proactive monitoring also help mitigate risks like slashing and service interruptions.\n\nThis guide covers essential practices for managing a validator, including performance tuning techniques, security hardening, and tools for real-time monitoring. Whether you're fine-tuning CPU settings, configuring NUMA balancing, or setting up a robust alert system, these steps will help you build a resilient and efficient validator operation."} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 1, "depth": 2, "title": "Configuration Optimization", "anchor": "configuration-optimization", "start_char": 759, "end_char": 987, "estimated_token_count": 35, "token_estimator": "heuristic-v1", "text": "## Configuration Optimization\n\nFor those seeking to optimize their validator's performance, the following configurations can improve responsiveness, reduce latency, and ensure consistent performance during high-demand periods."} {"page_id": "infrastructure-running-a-validator-operational-tasks-general-management", "page_title": "General Management", "index": 2, "depth": 3, "title": "Deactivate Simultaneous Multithreading", "anchor": "deactivate-simultaneous-multithreading", "start_char": 987, "end_char": 2478, "estimated_token_count": 333, "token_estimator": "heuristic-v1", "text": "### Deactivate Simultaneous Multithreading\n\nPolkadot validators operate primarily in single-threaded mode for critical tasks, so optimizing single-core CPU performance can reduce latency and improve stability. 
Deactivating simultaneous multithreading (SMT) can prevent virtual cores from affecting performance. SMT is called Hyper-Threading on Intel and 2-way SMT on AMD Zen.\n\nTake the following steps to deactivate every other (vCPU) core:\n\n1. Loop though all the CPU cores and deactivate the virtual cores associated with them:\n\n ```bash\n for cpunum in $(cat /sys/devices/system/cpu/cpu*/topology/thread_siblings_list | \\\n cut -s -d, -f2- | tr ',' '\\n' | sort -un)\n do\n echo 0 > /sys/devices/system/cpu/cpu$cpunum/online\n done\n ```\n\n2. To permanently save the changes, add `nosmt=force` to the `GRUB_CMDLINE_LINUX_DEFAULT` variable in `/etc/default/grub`:\n\n ```bash\n sudo nano /etc/default/grub\n # Add to GRUB_CMDLINE_LINUX_DEFAULT\n ```\n\n ```config title=\"/etc/default/grub\"\n GRUB_DEFAULT = 0;\n GRUB_HIDDEN_TIMEOUT = 0;\n GRUB_HIDDEN_TIMEOUT_QUIET = true;\n GRUB_TIMEOUT = 10;\n GRUB_DISTRIBUTOR = `lsb_release -i -s 2> /dev/null || echo Debian`;\n GRUB_CMDLINE_LINUX_DEFAULT = 'nosmt=force';\n GRUB_CMDLINE_LINUX = '';\n ```\n\n3. Update GRUB to apply changes:\n\n ```bash\n sudo update-grub\n ```\n\n4. After the reboot, you should see that half of the cores are offline. To confirm, run:\n\n ```bash\n lscpu --extended\n ```"} @@ -827,14 +827,14 @@ {"page_id": "infrastructure-running-a-validator-operational-tasks-upgrade-your-node", "page_title": "Upgrade a Validator Node", "index": 5, "depth": 3, "title": "Session `N`", "anchor": "session-n", "start_char": 3111, "end_char": 4063, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "### Session `N`\n\n1. **Start Validator B**: Launch a secondary node and wait until it is fully synced with the network. Once synced, start it with the `--validator` flag. This node will now act as Validator B.\n2. **Generate session keys**: Create new session keys specifically for Validator B.\n3. **Submit the `set_key` extrinsic**: Use your staking proxy account to submit a `set_key` extrinsic, linking the session keys for Validator B to your staking setup.\n4. **Record the session**: Make a note of the session in which you executed this extrinsic.\n5. **Wait for session changes**: Allow the current session to end and then wait for two additional full sessions for the new keys to take effect.\n\n!!! warning \"Keep Validator A running\"\n\n It is crucial to keep Validator A operational during this entire waiting period. Since `set_key` does not take effect immediately, turning off Validator A too early may result in chilling or even slashing."} {"page_id": "infrastructure-running-a-validator-operational-tasks-upgrade-your-node", "page_title": "Upgrade a Validator Node", "index": 6, "depth": 3, "title": "Session `N+3`", "anchor": "session-n3", "start_char": 4063, "end_char": 5624, "estimated_token_count": 378, "token_estimator": "heuristic-v1", "text": "### Session `N+3`\n\nAt this stage, Validator B becomes your active validator. You can now safely perform any maintenance tasks on Validator A.\n\nComplete the following steps when you are ready to bring Validator A back online:\n\n1. **Start Validator A**: Launch Validator A, sync the blockchain database, and ensure it is running with the `--validator` flag.\n2. **Generate new session keys for Validator A**: Create fresh session keys for Validator A.\n3. **Submit the `set_key` extrinsic**: Using your staking proxy account, submit a `set_key` extrinsic with the new Validator A session keys.\n4. 
**Record the session**: Again, make a note of the session in which you executed this extrinsic.\n\nKeep Validator B active until the session during which you executed the `set-key` extrinsic completes plus two additional full sessions have passed. Once Validator A has successfully taken over, you can safely stop Validator B. This process helps ensure a smooth handoff between nodes and minimizes the risk of downtime or penalties. Verify the transition by checking for finalized blocks in the new session. The logs should indicate the successful change, similar to the example below:\n\n
\n INSERT_COMMAND\n 2019-10-28 21:44:13 Applying authority set change scheduled at block #450092\n 2019-10-28 21:44:13 Applying GRANDPA set change to new set with 20 authorities\n \n
"} {"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 593, "end_char": 643, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 643, "end_char": 1520, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator-operational-tasks", "page_title": "Operational Tasks", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 643, "end_char": 1498, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 26, "end_char": 981, "estimated_token_count": 159, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nRunning a validator in the Polkadot ecosystem is essential for maintaining network security and decentralization. Validators are responsible for validating transactions and adding new blocks to the chain, ensuring the system operates smoothly. In return for their services, validators earn rewards. However, the role comes with inherent risks, such as slashing penalties for misbehavior or technical failures. If you’re new to validation, starting on Kusama provides a lower-stakes environment to gain valuable experience before progressing to the Polkadot network.\n\nThis guide covers everything you need to know about becoming a validator, including system requirements, staking prerequisites, and infrastructure setup. Whether you’re deploying on a VPS or running your node on custom hardware, you’ll learn how to optimize your validator for performance and security, ensuring compliance with network standards while minimizing risks."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 981, "end_char": 2390, "estimated_token_count": 296, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nRunning a validator requires solid system administration skills and a secure, well-maintained infrastructure. Below are the primary requirements you need to be aware of before getting started:\n\n- **System administration expertise**: Handling technical anomalies and maintaining node infrastructure is critical. Validators must be able to troubleshoot and optimize their setup.\n- **Security**: Ensure your setup follows best practices for securing your node. Refer to the [Secure Your Validator](/infrastructure/running-a-validator/operational-tasks/general-management/#secure-your-validator){target=\\_blank} section to learn about important security measures.\n- **Network choice**: Start with [Kusama](/infrastructure/running-a-validator/onboarding-and-offboarding/set-up-validator/#run-a-kusama-validator){target=\\_blank} to gain experience. 
Look for \"Adjustments for Kusama\" throughout these guides for tips on adapting the provided instructions for the Kusama network.\n- **Staking requirements**: A minimum amount of native token (KSM or DOT) is required to be elected into the validator set. The required stake can come from your own holdings or from nominators.\n- **Risk of slashing**: Any DOT you stake is at risk if your setup fails or your validator misbehaves. If you’re unsure of your ability to maintain a reliable validator, consider nominating your DOT to a trusted validator."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 2, "depth": 2, "title": "Minimum Hardware Requirements", "anchor": "minimum-hardware-requirements", "start_char": 2390, "end_char": 3554, "estimated_token_count": 251, "token_estimator": "heuristic-v1", "text": "## Minimum Hardware Requirements\n\nPolkadot validators rely on high-performance hardware to process blocks efficiently. The recommended minimum hardware requirements to ensure a fully functional and performant validator are as follows:\n\n- CPU:\n\n - x86-64 compatible.\n - Eight physical cores @ 3.4 GHz.\n - Processor:\n - **Intel**: Ice Lake or newer (Xeon or Core series)\n - **AMD**: Zen3 or newer (EPYC or Ryzen)\n - Simultaneous multithreading disabled:\n - **Intel**: Hyper-Threading\n - **AMD**: SMT\n - [Single-threaded performance](https://www.cpubenchmark.net/singleThread.html){target=\\_blank} is prioritized over higher cores count.\n\n- Storage:\n\n - **NVMe SSD**: At least 2 TB for blockchain data recommended (prioritize latency rather than throughput).\n - Storage requirements will increase as the chain grows. For current estimates, see the [current chain snapshot](https://stakeworld.io/docs/dbsize){target=\\_blank}.\n\n- Memory:\n\n - 32 GB DDR4 ECC\n\n- Network:\n\n - Symmetric networking speed of 500 Mbit/s is required to handle large numbers of parachains and ensure congestion control during peak times."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 3, "depth": 2, "title": "VPS Provider List", "anchor": "vps-provider-list", "start_char": 3554, "end_char": 6073, "estimated_token_count": 575, "token_estimator": "heuristic-v1", "text": "## VPS Provider List\n\nWhen selecting a VPS provider for your validator node, prioritize reliability, consistent performance, and adherence to the specific hardware requirements set for Polkadot validators. The following server types have been tested and showed acceptable performance in benchmark tests. However, this is not an endorsement and actual performance may vary depending on your workload and VPS provider.\n\nBe aware that some providers may overprovision the underlying host and use shared storage such as NVMe over TCP, which appears as local storage. These setups might result in poor or inconsistent performance. 
Benchmark your infrastructure before deploying.\n\n- **[Google Cloud Platform (GCP)](https://cloud.google.com/){target=\\_blank}**: `c2` and `c2d` machine families offer high-performance configurations suitable for validators.\n- **[Amazon Web Services (AWS)](https://aws.amazon.com/){target=\\_blank}**: `c6id` machine family provides strong performance, particularly for I/O-intensive workloads.\n- **[OVH](https://www.ovhcloud.com/en-au/){target=\\_blank}**: Can be a budget-friendly solution if it meets your minimum hardware specifications.\n- **[Digital Ocean](https://www.digitalocean.com/){target=\\_blank}**: Popular among developers, Digital Ocean's premium droplets offer configurations suitable for medium to high-intensity workloads.\n- **[Vultr](https://www.vultr.com/){target=\\_blank}**: Offers flexibility with plans that may meet validator requirements, especially for high-bandwidth needs.\n- **[Linode](https://www.linode.com/){target=\\_blank}**: Provides detailed documentation, which can be helpful for setup.\n- **[Scaleway](https://www.scaleway.com/en/){target=\\_blank}**: Offers high-performance cloud instances that can be suitable for validator nodes.\n- **[OnFinality](https://onfinality.io/en){target=\\_blank}**: Specialized in blockchain infrastructure, OnFinality provides validator-specific support and configurations.\n\n!!! warning \"Acceptable use policies\"\n Different VPS providers have varying acceptable use policies, and not all allow cryptocurrency-related activities. \n\n For example, Digital Ocean, requires explicit permission to use servers for cryptocurrency mining and defines unauthorized mining as [network abuse](https://www.digitalocean.com/legal/acceptable-use-policy#network-abuse){target=\\_blank} in their acceptable use policy. \n \n Review the terms for your VPS provider to avoid account suspension or server shutdown due to policy violations."} {"page_id": "infrastructure-running-a-validator-requirements", "page_title": "Validator Requirements", "index": 4, "depth": 2, "title": "Minimum Bond Requirement", "anchor": "minimum-bond-requirement", "start_char": 6073, "end_char": 6838, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Minimum Bond Requirement\n\nBefore bonding DOT, ensure you meet the minimum bond requirement to start a validator instance. The minimum bond is the least DOT you need to stake to enter the validator set. To become eligible for rewards, your validator node must be nominated by enough staked tokens.\n\nFor example, on November 19, 2024, the minimum stake backing a validator in Polkadot's era 1632 was 1,159,434.248 DOT. 
You can check the current minimum stake required using these tools:\n\n- [**Chain State Values**](https://wiki.polkadot.com/general/chain-state-values/){target=\\_blank}\n- [**Subscan**](https://polkadot.subscan.io/validator_list?status=validator){target=\\_blank}\n- [**Staking Dashboard**](https://staking.polkadot.cloud/#/overview){target=\\_blank}"} {"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 412, "end_char": 462, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 462, "end_char": 1603, "estimated_token_count": 307, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-running-a-validator", "page_title": "Running a Validator", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 462, "end_char": 1570, "estimated_token_count": 298, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 24, "end_char": 674, "estimated_token_count": 104, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn Polkadot's Nominated Proof of Stake (NPoS) system, validator misconduct is deterred through a combination of slashing, disabling, and reputation penalties. Validators and nominators who stake tokens face consequences for validator misbehavior, which range from token slashes to restrictions on network participation.\n\nThis page outlines the types of offenses recognized by Polkadot, including block equivocations and invalid votes, as well as the corresponding penalties. While some parachains may implement additional custom slashing mechanisms, this guide focuses on the offenses tied to staking within the Polkadot ecosystem."} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 1, "depth": 2, "title": "Offenses", "anchor": "offenses", "start_char": 674, "end_char": 1106, "estimated_token_count": 86, "token_estimator": "heuristic-v1", "text": "## Offenses\n\nPolkadot is a public permissionless network. As such, it has a mechanism to disincentivize offenses and incentivize good behavior. You can review the [parachain protocol](https://wiki.polkadot.com/learn/learn-parachains-protocol/#parachain-protocol){target=\\_blank} to understand better the terminology used to describe offenses. Polkadot validator offenses fall into two categories: invalid votes and equivocations."} {"page_id": "infrastructure-staking-mechanics-offenses-and-slashes", "page_title": "Offenses and Slashes", "index": 2, "depth": 3, "title": "Invalid Votes", "anchor": "invalid-votes", "start_char": 1106, "end_char": 1733, "estimated_token_count": 128, "token_estimator": "heuristic-v1", "text": "### Invalid Votes\n\nA validator will be penalized for inappropriate voting activity during the block inclusion and approval processes. 
The invalid voting related offenses are as follows:\n\n- **Backing an invalid block**: A para-validator backs an invalid block for inclusion in a fork of the relay chain.\n- **`ForInvalid` vote**: When acting as a secondary checker, the validator votes in favor of an invalid block.\n- **`AgainstValid` vote**: When acting as a secondary checker, the validator votes against a valid block. This type of vote wastes network resources required to resolve the disparate votes and resulting dispute."} @@ -851,7 +851,7 @@ {"page_id": "infrastructure-staking-mechanics-rewards-payout", "page_title": "Rewards Payout", "index": 4, "depth": 2, "title": "Running Multiple Validators", "anchor": "running-multiple-validators", "start_char": 5622, "end_char": 7233, "estimated_token_count": 423, "token_estimator": "heuristic-v1", "text": "## Running Multiple Validators\n\nRunning multiple validators can offer a more favorable risk/reward ratio compared to running a single one. If you have sufficient DOT or nominators staking on your validators, maintaining multiple validators within the active set can yield higher rewards.\n\nIn the preceding section, with 18 DOT staked and no nominators, Alice earned 2 DOT in one era. This example uses DOT, but the same principles apply for KSM on the Kusama network. By managing stake across multiple validators, you can potentially increase overall returns. Recall the set of validators from the preceding section:\n\n``` mermaid\nflowchart TD\n A[\"Alice (18 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n D[\"Dave (7 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> D \n```\n\nNow, assume Alice decides to split their stake and run two validators, each with a nine DOT stake. This validator set only has four spots and priority is given to validators with a larger stake. In this example, Dave has the smallest stake and loses his spot in the validator set. Now, Alice will earn two shares of the total payout each era as illustrated below:\n\n``` mermaid\nflowchart TD\n A[\"Alice (9 DOT)\"]\n F[\"Alice (9 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> F \n```\n\nWith enough stake, you could run more than two validators. However, each validator must have enough stake behind it to maintain a spot in the validator set."} {"page_id": "infrastructure-staking-mechanics-rewards-payout", "page_title": "Rewards Payout", "index": 5, "depth": 2, "title": "Nominators and Validator Payments", "anchor": "nominators-and-validator-payments", "start_char": 7233, "end_char": 11070, "estimated_token_count": 990, "token_estimator": "heuristic-v1", "text": "## Nominators and Validator Payments\n\nA nominator's stake allows them to vote for validators and earn a share of the rewards without managing a validator node. Although staking rewards depend on validator activity during an era, validators themselves never control or own nominator rewards. To trigger payouts, anyone can call the `staking.payoutStakers` or `staking.payoutStakerByPage` methods, which mint and distribute rewards directly to the recipients. This trustless process ensures nominators receive their earned rewards.\n\nValidators set a commission rate as a percentage of the block reward, affecting how rewards are shared with nominators. 
A 0% commission means the validator keeps only rewards from their self-stake, while a 100% commission means they retain all rewards, leaving none for nominators.\n\nThe following examples model splitting validator payments between nominator and validator using various commission percentages. For simplicity, these examples assume a Polkadot-SDK based relay chain that uses DOT as a native token and a single nominator per validator. Calculations of KSM reward payouts for Kusama follow the same formula. \n\nStart with the original validator set from the previous section: \n\n``` mermaid\nflowchart TD\n A[\"Alice (18 DOT)\"]\n B[\"Bob (9 DOT)\"]\n C[\"Carol (8 DOT)\"]\n D[\"Dave (7 DOT)\"]\n E[\"Payout (8 DOT total)\"]\n E --\"2 DOT\"--> A\n E --\"2 DOT\"--> B\n E --\"2 DOT\"--> C\n E --\"2 DOT\"--> D \n```\n\nThe preceding diagram shows each validator receiving a 2 DOT payout, but doesn't account for sharing rewards with nominators. The following diagram shows what nominator payout might look like for validator Alice. Alice has a 20% commission rate and holds 50% of the stake for their validator:\n\n``` mermaid\n\nflowchart TD\n A[\"Gross Rewards = 2 DOT\"]\n E[\"Commission = 20%\"]\n F[\"Alice Validator Payment = 0.4 DOT\"]\n G[\"Total Stake Rewards = 1.6 DOT\"]\n B[\"Alice Validator Stake = 18 DOT\"]\n C[\"9 DOT Alice (50%)\"]\n H[\"Alice Stake Reward = 0.8 DOT\"]\n I[\"Total Alice Validator Reward = 1.2 DOT\"]\n D[\"9 DOT Nominator (50%)\"]\n J[\"Total Nominator Reward = 0.8 DOT\"]\n \n A --> E\n E --(2 x 0.20)--> F\n F --(2 - 0.4)--> G\n B --> C\n B --> D\n C --(1.6 x 0.50)--> H\n H --(0.4 + 0.8)--> I\n D --(1.60 x 0.50)--> J\n```\n\nNotice the validator commission rate is applied against the gross amount of rewards for the era. The validator commission is subtracted from the total rewards. After the commission is paid to the validator, the remaining amount is split among stake owners according to their percentage of the total stake. A validator's total rewards for an era include their commission plus their piece of the stake rewards. \n\nNow, consider a different scenario for validator Bob where the commission rate is 40%, and Bob holds 33% of the stake for their validator:\n\n``` mermaid\n\nflowchart TD\n A[\"Gross Rewards = 2 DOT\"]\n E[\"Commission = 40%\"]\n F[\"Bob Validator Payment = 0.8 DOT\"]\n G[\"Total Stake Rewards = 1.2 DOT\"]\n B[\"Bob Validator Stake = 9 DOT\"]\n C[\"3 DOT Bob (33%)\"]\n H[\"Bob Stake Reward = 0.4 DOT\"]\n I[\"Total Bob Validator Reward = 1.2 DOT\"]\n D[\"6 DOT Nominator (67%)\"]\n J[\"Total Nominator Reward = 0.8 DOT\"]\n \n A --> E\n E --(2 x 0.4)--> F\n F --(2 - 0.8)--> G\n B --> C\n B --> D\n C --(1.2 x 0.33)--> H\n H --(0.8 + 0.4)--> I\n D --(1.2 x 0.67)--> J\n```\n\nBob holds a smaller percentage of their node's total stake, making their stake reward smaller than Alice's. In this scenario, Bob makes up the difference by charging a 40% commission rate and ultimately ends up with the same total payment as Alice. 
Each validator will need to find their ideal balance between the amount of stake and commission rate to attract nominators while still making running a validator worthwhile."} {"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 487, "end_char": 537, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 537, "end_char": 1824, "estimated_token_count": 340, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "infrastructure-staking-mechanics", "page_title": "Staking Mechanics", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 537, "end_char": 1791, "estimated_token_count": 331, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "infrastructure", "page_title": "Infrastructure", "index": 0, "depth": 2, "title": "Choosing the Right Role", "anchor": "choosing-the-right-role", "start_char": 486, "end_char": 2813, "estimated_token_count": 439, "token_estimator": "heuristic-v1", "text": "## Choosing the Right Role\n\nSelecting your role within the Polkadot ecosystem depends on your goals, resources, and expertise. Below are detailed considerations for each role:\n\n- **Running a node**:\n - **Purpose**: A node provides access to network data and supports API queries. It is commonly used for.\n - **Development and testing**: Offers a local instance to simulate network conditions and test applications.\n - **Production use**: Acts as a data source for dApps, clients, and other applications needing reliable access to the blockchain.\n - **Requirements**: Moderate hardware resources to handle blockchain data efficiently.\n - **Responsibilities**: A node’s responsibilities vary based on its purpose.\n - **Development and testing**: Enables developers to test features, debug code, and simulate network interactions in a controlled environment.\n - **Production use**: Provides consistent and reliable data access for dApps and other applications, ensuring minimal downtime.\n\n- **Running a validator**:\n - **Purpose**: Validators play a critical role in securing the Polkadot relay chain. 
They validate parachain block submissions, participate in consensus, and help maintain the network's overall integrity.\n - **Requirements**: Becoming a validator requires.\n - **Staking**: A variable amount of DOT tokens to secure the network and demonstrate commitment.\n - **Hardware**: High-performing hardware resources capable of supporting intensive blockchain operations.\n - **Technical expertise**: Proficiency in setting up and maintaining nodes, managing updates, and understanding Polkadot's consensus mechanisms.\n - **Community involvement**: Building trust and rapport within the community to attract nominators willing to stake with your validator.\n - **Responsibilities**: Validators have critical responsibilities to ensure network health.\n - **Uptime**: Maintain near-constant availability to avoid slashing penalties for downtime or unresponsiveness.\n - **Network security**: Participate in consensus and verify parachain transactions to uphold the network's security and integrity.\n - **Availability**: Monitor the network for events and respond to issues promptly, such as misbehavior reports or protocol updates."} {"page_id": "infrastructure", "page_title": "Infrastructure", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2813, "end_char": 2862, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "polkadot-protocol-architecture-parachains-consensus", "page_title": "Parachain Consensus", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 23, "end_char": 936, "estimated_token_count": 146, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nParachains are independent blockchains built with the Polkadot SDK, designed to leverage Polkadot’s relay chain for shared security and transaction finality. These specialized chains operate as part of Polkadot’s execution sharding model, where each parachain manages its own state and transactions while relying on the relay chain for validation and consensus.\n\nAt the core of parachain functionality are collators, specialized nodes that sequence transactions into blocks and maintain the parachain’s state. Collators optimize Polkadot’s architecture by offloading state management from the relay chain, allowing relay chain validators to focus solely on validating parachain blocks.\n\nThis guide explores how parachain consensus works, including the roles of collators and validators, and the steps involved in securing parachain blocks within Polkadot’s scalable and decentralized framework."} @@ -1040,19 +1040,19 @@ {"page_id": "polkadot-protocol-onchain-governance", "page_title": "On-Chain Governance", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2065, "end_char": 2114, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 12, "end_char": 597, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nAccounts are essential for managing identity, transactions, and governance on the network in the Polkadot SDK. 
Understanding these components is critical for seamless development and operation on the network, whether you're building or interacting with Polkadot-based chains.\n\nThis page will guide you through the essential aspects of accounts, including their data structure, balance types, reference counters, and address formats. You’ll learn how accounts are managed within the runtime, how balances are categorized, and how addresses are encoded and validated."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 1, "depth": 2, "title": "Account Data Structure", "anchor": "account-data-structure", "start_char": 597, "end_char": 862, "estimated_token_count": 42, "token_estimator": "heuristic-v1", "text": "## Account Data Structure\n\nAccounts are foundational to any blockchain, and the Polkadot SDK provides a flexible management system. This section explains how the Polkadot SDK defines accounts and manages their lifecycle through data structures within the runtime."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 2, "depth": 3, "title": "Account", "anchor": "account", "start_char": 862, "end_char": 3162, "estimated_token_count": 569, "token_estimator": "heuristic-v1", "text": "### Account\n\nThe [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Account.html){target=\\_blank} is a storage map within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank} that links an account ID to its corresponding data. This structure is fundamental for mapping account-related information within the chain.\n\nThe code snippet below shows how accounts are defined:\n\n```rs\n /// The full account information for a particular account ID.\n \t#[pallet::storage]\n \t#[pallet::getter(fn account)]\n \tpub type Account = StorageMap<\n \t\t_,\n \t\tBlake2_128Concat,\n \t\tT::AccountId,\n \t\tAccountInfo,\n \t\tValueQuery,\n \t>;\n```\n\nThe preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`).\n\nThe `StorageMap` consists of the following parameters:\n\n- **`_`**: Used in macro expansion and acts as a placeholder for the storage prefix type. Tells the macro to insert the default prefix during expansion.\n- **`Blake2_128Concat`**: The hashing function applied to keys in the storage map.\n- **`T: :AccountId`**: Represents the key type, which corresponds to the account’s unique ID.\n- **`AccountInfo`**: The value type stored in the map. 
For each account ID, the map stores an `AccountInfo` struct containing:\n\n - **`T::Nonce`**: A nonce for the account, which is incremented with each transaction to ensure transaction uniqueness.\n - **`T: :AccountData`**: Custom account data defined by the runtime configuration, which could include balances, locked funds, or other relevant information.\n \n- **`ValueQuery`**: Defines how queries to the storage map behave when no value is found; returns a default value instead of `None`.\n\nFor a detailed explanation of storage maps, see the [`StorageMap`](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/struct.StorageMap.html){target=\\_blank} entry in the Rust docs."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 3, "depth": 3, "title": "Account Info", "anchor": "account-info", "start_char": 3162, "end_char": 5825, "estimated_token_count": 617, "token_estimator": "heuristic-v1", "text": "### Account Info\n\nThe `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules.\n\n```rs\n/// Information of an account.\n#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)]\npub struct AccountInfo {\n\t/// The number of transactions this account has sent.\n\tpub nonce: Nonce,\n\t/// The number of other modules that currently depend on this account's existence. The account\n\t/// cannot be reaped until this is zero.\n\tpub consumers: RefCount,\n\t/// The number of other modules that allow this account to exist. The account may not be reaped\n\t/// until this and `sufficients` are both zero.\n\tpub providers: RefCount,\n\t/// The number of modules that allow this account to exist for their own purposes only. The\n\t/// account may not be reaped until this and `providers` are both zero.\n\tpub sufficients: RefCount,\n\t/// The additional data that belongs to this account. Used to store the balance(s) in a lot of\n\t/// chains.\n\tpub data: AccountData,\n}\n```\n\nThe `AccountInfo` structure includes the following components:\n\n- **`nonce`**: Tracks the number of transactions initiated by the account, which ensures transaction uniqueness and prevents replay attacks.\n- **`consumers`**: Counts how many other modules or pallets rely on this account’s existence. The account cannot be removed from the chain (reaped) until this count reaches zero.\n- **`providers`**: Tracks how many modules permit this account’s existence. An account can only be reaped once both `providers` and `sufficients` are zero.\n- **`sufficients`**: Represents the number of modules that allow the account to exist for internal purposes, independent of any other modules.\n- **`AccountData`**: A flexible data structure that can be customized in the runtime configuration, usually containing balances or other user-specific data.\n\nThis structure helps manage an account's state and prevents its premature removal while it is still referenced by other on-chain data or modules. 
The [`AccountInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.AccountInfo.html){target=\\_blank} structure can vary as long as it satisfies the trait bounds defined by the `AccountData` associated type in the [`frame-system::pallet::Config`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html){target=\\_blank} trait."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 4, "depth": 3, "title": "Account Reference Counters", "anchor": "account-reference-counters", "start_char": 5825, "end_char": 10918, "estimated_token_count": 1040, "token_estimator": "heuristic-v1", "text": "### Account Reference Counters\n\nPolkadot SDK uses reference counters to track an account’s dependencies across different runtime modules. These counters ensure that accounts remain active while data is associated with them.\n\nThe reference counters include:\n\n- **`consumers`**: Prevents account removal while other pallets still rely on the account.\n- **`providers`**: Ensures an account is active before other pallets store data related to it.\n- **`sufficients`**: Indicates the account’s independence, ensuring it can exist even without a native token balance, such as when holding sufficient alternative assets.\n\n#### Providers Reference Counters\n\nThe `providers` counter ensures that an account is ready to be depended upon by other runtime modules. For example, it is incremented when an account has a balance above the existential deposit, which marks the account as active.\n\nThe system requires this reference counter to be greater than zero for the `consumers` counter to be incremented, ensuring the account is stable before any dependencies are added.\n\n#### Consumers Reference Counters\n\nThe `consumers` counter ensures that the account cannot be reaped until all references to it across the runtime have been removed. This check prevents the accidental deletion of accounts that still have active on-chain data.\n\nIt is the user’s responsibility to clear out any data from other runtime modules if they wish to remove their account and reclaim their existential deposit.\n\n#### Sufficients Reference Counter\n\nThe `sufficients` counter tracks accounts that can exist independently without relying on a native account balance. This is useful for accounts holding other types of assets, like tokens, without needing a minimum balance in the native token.\n\nFor instance, the [Assets pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_assets/index.html){target=\\_blank}, may increment this counter for an account holding sufficient tokens.\n\n#### Account Deactivation\n\nIn Polkadot SDK-based chains, an account is deactivated when its reference counters (such as `providers`, `consumers`, and `sufficient`) reach zero. These counters ensure the account remains active as long as other runtime modules or pallets reference it.\n\nWhen all dependencies are cleared and the counters drop to zero, the account becomes deactivated and may be removed from the chain (reaped). This is particularly important in Polkadot SDK-based blockchains, where accounts with balances below the existential deposit threshold are pruned from storage to conserve state resources.\n\nEach pallet that references an account has cleanup functions that decrement these counters when the pallet no longer depends on the account. 
Once these counters reach zero, the account is marked for deactivation.\n\n#### Updating Counters\n\nThe Polkadot SDK provides runtime developers with various methods to manage account lifecycle events, such as deactivation or incrementing reference counters. These methods ensure that accounts cannot be reaped while still in use.\n\nThe following helper functions manage these counters:\n\n- **`inc_consumers()`**: Increments the `consumer` reference counter for an account, signaling that another pallet depends on it.\n- **`dec_consumers()`**: Decrements the `consumer` reference counter, signaling that a pallet no longer relies on the account.\n- **`inc_providers()`**: Increments the `provider` reference counter, ensuring the account remains active.\n- **`dec_providers()`**: Decrements the `provider` reference counter, allowing for account deactivation when no longer in use.\n- **`inc_sufficients()`**: Increments the `sufficient` reference counter for accounts that hold sufficient assets.\n- **`dec_sufficients()`**: Decrements the `sufficient` reference counter.\n\nTo ensure proper account cleanup and lifecycle management, a corresponding decrement should be made for each increment action.\n\nThe `System` pallet offers three query functions to assist developers in tracking account states:\n\n- **[`can_inc_consumer()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_inc_consumer){target=\\_blank}**: Checks if the account can safely increment the consumer reference.\n- **[`can_dec_provider()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_dec_provider){target=\\_blank}**: Ensures that no consumers exist before allowing the decrement of the provider counter.\n- **[`is_provider_required()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.is_provider_required){target=\\_blank}**: Verifies whether the account still has any active consumer references.\n\nThis modular and flexible system of reference counters tightly controls the lifecycle of accounts in Polkadot SDK-based blockchains, preventing the accidental removal or retention of unneeded accounts. You can refer to the [System pallet Rust docs](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html){target=\\_blank} for more details."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 5, "depth": 2, "title": "Account Balance Types", "anchor": "account-balance-types", "start_char": 10918, "end_char": 12838, "estimated_token_count": 465, "token_estimator": "heuristic-v1", "text": "## Account Balance Types\n\nIn the Polkadot ecosystem, account balances are categorized into different types based on how the funds are utilized and their availability. These balance types determine the actions that can be performed, such as transferring tokens, paying transaction fees, or participating in governance activities. Understanding these balance types helps developers manage user accounts and implement balance-dependent logic.\n\n!!! note \"A more efficient distribution of account balance types is in development\"\n Soon, pallets in the Polkadot SDK will implement the [`Fungible` trait](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} (see the [tracking issue](https://github.com/paritytech/polkadot-sdk/issues/226){target=\\_blank} for more details). 
For example, the [`transaction-storage`](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_storage/index.html){target=\\_blank} pallet changed the implementation of the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/currency/index.html){target=\\_blank} trait (see the [Refactor transaction storage pallet to use fungible traits](https://github.com/paritytech/polkadot-sdk/pull/1800){target=\\_blank} PR for further details):\n\n ```rust\n type BalanceOf = <::Currency as Currency<::AccountId>>::Balance;\n ```\n \n To the [`Fungible`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} trait:\n\n ```rust\n type BalanceOf = <::Currency as FnInspect<::AccountId>>::Balance;\n ```\n \n This update will enable more efficient use of account balances, allowing the free balance to be utilized for on-chain activities such as setting proxies and managing identities."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 6, "depth": 3, "title": "Balance Types", "anchor": "balance-types", "start_char": 12838, "end_char": 15313, "estimated_token_count": 601, "token_estimator": "heuristic-v1", "text": "### Balance Types\n\nThe five main balance types are:\n\n- **Free balance**: Represents the total tokens available to the account for any on-chain activity, including staking, governance, and voting. However, it may not be fully spendable or transferrable if portions of it are locked or reserved.\n- **Locked balance**: Portions of the free balance that cannot be spent or transferred because they are tied up in specific activities like [staking](https://wiki.polkadot.com/learn/learn-staking/#nominating-validators){target=\\_blank}, [vesting](https://wiki.polkadot.com/learn/learn-guides-transfers/#vested-transfers-with-the-polkadot-js-ui){target=\\_blank}, or participating in [governance](https://wiki.polkadot.com/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\\_blank}. While the tokens remain part of the free balance, they are non-transferable for the duration of the lock.\n- **Reserved balance**: Funds locked by specific system actions, such as setting up an [identity](https://wiki.polkadot.com/learn/learn-identity/){target=\\_blank}, creating [proxies](https://wiki.polkadot.com/learn/learn-proxies/){target=\\_blank}, or submitting [deposits for governance proposals](https://wiki.polkadot.com/learn/learn-guides-polkadot-opengov/#claiming-opengov-deposits){target=\\_blank}. These tokens are not part of the free balance and cannot be spent unless they are unreserved.\n- **Spendable balance**: The portion of the free balance that is available for immediate spending or transfers. It is calculated by subtracting the maximum of locked or reserved amounts from the free balance, ensuring that existential deposit limits are met.\n- **Untouchable balance**: Funds that cannot be directly spent or transferred but may still be utilized for on-chain activities, such as governance participation or staking. These tokens are typically tied to certain actions or locked for a specific period.\n\nThe spendable balance is calculated as follows:\n\n```text\nspendable = free - max(locked - reserved, ED)\n```\n\nHere, `free`, `locked`, and `reserved` are defined above. 
The `ED` represents the [existential deposit](https://wiki.polkadot.com/learn/learn-accounts/#existential-deposit-and-reaping){target=\\_blank}, the minimum balance required to keep an account active and prevent it from being reaped. You may find you can't see all balance types when looking at your account via a wallet. Wallet providers often display only spendable, locked, and reserved balances."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 7, "depth": 3, "title": "Locks", "anchor": "locks", "start_char": 15313, "end_char": 17508, "estimated_token_count": 464, "token_estimator": "heuristic-v1", "text": "### Locks\n\nLocks are applied to an account's free balance, preventing that portion from being spent or transferred. Locks are automatically placed when an account participates in specific on-chain activities, such as staking or governance. Although multiple locks may be applied simultaneously, they do not stack. Instead, the largest lock determines the total amount of locked tokens.\n\nLocks follow these basic rules:\n\n- If different locks apply to varying amounts, the largest lock amount takes precedence.\n- If multiple locks apply to the same amount, the lock with the longest duration governs when the balance can be unlocked.\n\n#### Locks Example\n\nConsider an example where an account has 80 DOT locked for both staking and governance purposes like so:\n\n- 80 DOT is staked with a 28-day lock period.\n- 24 DOT is locked for governance with a 1x conviction and a 7-day lock period.\n- 4 DOT is locked for governance with a 6x conviction and a 224-day lock period.\n\nIn this case, the total locked amount is 80 DOT because only the largest lock (80 DOT from staking) governs the locked balance. These 80 DOT will be released at different times based on the lock durations. In this example, the 24 DOT locked for governance will be released first since the shortest lock period is seven days. The 80 DOT stake with a 28-day lock period is released next. Now, all that remains locked is the 4 DOT for governance. After 224 days, all 80 DOT (minus the existential deposit) will be free and transferable.\n\n![Illustration of Lock Example](/images/polkadot-protocol/parachain-basics/accounts/locks-example-2.webp)\n\n#### Edge Cases for Locks\n\nIn scenarios where multiple convictions and lock periods are active, the lock duration and amount are determined by the longest period and largest amount. For example, if you delegate with different convictions and attempt to undelegate during an active lock period, the lock may be extended for the full amount of tokens. For a detailed discussion on edge case lock behavior, see this [Stack Exchange post](https://substrate.stackexchange.com/questions/5067/delegating-and-undelegating-during-the-lock-period-extends-it-for-the-initial-am){target=\\_blank}."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 8, "depth": 3, "title": "Balance Types on Polkadot.js", "anchor": "balance-types-on-polkadotjs", "start_char": 17508, "end_char": 20573, "estimated_token_count": 611, "token_estimator": "heuristic-v1", "text": "### Balance Types on Polkadot.js\n\nPolkadot.js provides a user-friendly interface for managing and visualizing various account balances on Polkadot and Kusama networks. When interacting with Polkadot.js, you will encounter multiple balance types that are critical for understanding how your funds are distributed and restricted. 
This section explains how different balances are displayed in the Polkadot.js UI and what each type represents.\n\n![](/images/polkadot-protocol/parachain-basics/accounts/account-balance-types-1.webp)\n\nThe most common balance types displayed on Polkadot.js are:\n\n- **Total balance**: The total number of tokens available in the account. This includes all tokens, whether they are transferable, locked, reserved, or vested. However, the total balance does not always reflect what can be spent immediately. In this example, the total balance is 0.6274 KSM.\n\n- **Transferable balance**: Shows how many tokens are immediately available for transfer. It is calculated by subtracting the locked and reserved balances from the total balance. For example, if an account has a total balance of 0.6274 KSM and a transferable balance of 0.0106 KSM, only the latter amount can be sent or spent freely.\n\n- **Vested balance**: Tokens that allocated to the account but released according to a specific schedule. Vested tokens remain locked and cannot be transferred until fully vested. For example, an account with a vested balance of 0.2500 KSM means that this amount is owned but not yet transferable.\n\n- **Locked balance**: Tokens that are temporarily restricted from being transferred or spent. These locks typically result from participating in staking, governance, or vested transfers. In Polkadot.js, locked balances do not stack—only the largest lock is applied. For instance, if an account has 0.5500 KSM locked for governance and staking, the locked balance would display 0.5500 KSM, not the sum of all locked amounts.\n\n- **Reserved balance**: Refers to tokens locked for specific on-chain actions, such as setting an identity, creating a proxy, or making governance deposits. Reserved tokens are not part of the free balance, but can be freed by performing certain actions. For example, removing an identity would unreserve those funds.\n\n- **Bonded balance**: The tokens locked for staking purposes. Bonded tokens are not transferable until they are unbonded after the unbonding period.\n\n- **Redeemable balance**: The number of tokens that have completed the unbonding period and are ready to be unlocked and transferred again. For example, if an account has a redeemable balance of 0.1000 KSM, those tokens are now available for spending.\n\n- **Democracy balance**: Reflects the number of tokens locked for governance activities, such as voting on referenda. These tokens are locked for the duration of the governance action and are only released after the lock period ends.\n\nBy understanding these balance types and their implications, developers and users can better manage their funds and engage with on-chain activities more effectively."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 9, "depth": 2, "title": "Address Formats", "anchor": "address-formats", "start_char": 20573, "end_char": 21032, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Address Formats\n\nThe SS58 address format is a core component of the Polkadot SDK that enables accounts to be uniquely identified across Polkadot-based networks. This format is a modified version of Bitcoin's Base58Check encoding, specifically designed to accommodate the multi-chain nature of the Polkadot ecosystem. 
SS58 encoding allows each chain to define its own set of addresses while maintaining compatibility and checksum validation for security."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 10, "depth": 3, "title": "Basic Format", "anchor": "basic-format", "start_char": 21032, "end_char": 22274, "estimated_token_count": 295, "token_estimator": "heuristic-v1", "text": "### Basic Format\n\nSS58 addresses consist of three main components:\n\n```text\nbase58encode(concat(,
, ))\n```\n\n- **Address type**: A byte or set of bytes that define the network (or chain) for which the address is intended. This ensures that addresses are unique across different Polkadot SDK-based chains.\n- **Address**: The public key of the account encoded as bytes.\n- **Checksum**: A hash-based checksum which ensures that addresses are valid and unaltered. The checksum is derived from the concatenated address type and address components, ensuring integrity.\n\nThe encoding process transforms the concatenated components into a Base58 string, providing a compact and human-readable format that avoids easily confused characters (e.g., zero '0', capital 'O', lowercase 'l'). This encoding function ([`encode`](https://docs.rs/bs58/latest/bs58/fn.encode.html){target=\\_blank}) is implemented exactly as defined in Bitcoin and IPFS specifications, using the same alphabet as both implementations.\n\nFor more details about the SS58 address format implementation, see the [`Ss58Codec`](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Ss58Codec.html){target=\\_blank} trait in the Rust Docs."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 11, "depth": 3, "title": "Address Type", "anchor": "address-type", "start_char": 22274, "end_char": 23209, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "### Address Type\n\nThe address type defines how an address is interpreted and to which network it belongs. Polkadot SDK uses different prefixes to distinguish between various chains and address formats:\n\n- **Address types `0-63`**: Simple addresses, commonly used for network identifiers.\n- **Address types `64-127`**: Full addresses that support a wider range of network identifiers.\n- **Address types `128-255`**: Reserved for future address format extensions.\n\nFor example, Polkadot’s main network uses an address type of 0, while Kusama uses 2. This ensures that addresses can be used without confusion between networks.\n\nThe address type is always encoded as part of the SS58 address, making it easy to quickly identify the network. Refer to the [SS58 registry](https://github.com/paritytech/ss58-registry){target=\\_blank} for the canonical listing of all address type identifiers and how they map to Polkadot SDK-based networks."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 12, "depth": 3, "title": "Address Length", "anchor": "address-length", "start_char": 23209, "end_char": 24403, "estimated_token_count": 268, "token_estimator": "heuristic-v1", "text": "### Address Length\n\nSS58 addresses can have different lengths depending on the specific format. Address lengths range from as short as 3 to 35 bytes, depending on the complexity of the address and network requirements. 
This flexibility allows SS58 addresses to adapt to different chains while providing a secure encoding mechanism.\n\n| Total | Type | Raw account | Checksum |\n|-------|------|-------------|----------|\n| 3 | 1 | 1 | 1 |\n| 4 | 1 | 2 | 1 |\n| 5 | 1 | 2 | 2 |\n| 6 | 1 | 4 | 1 |\n| 7 | 1 | 4 | 2 |\n| 8 | 1 | 4 | 3 |\n| 9 | 1 | 4 | 4 |\n| 10 | 1 | 8 | 1 |\n| 11 | 1 | 8 | 2 |\n| 12 | 1 | 8 | 3 |\n| 13 | 1 | 8 | 4 |\n| 14 | 1 | 8 | 5 |\n| 15 | 1 | 8 | 6 |\n| 16 | 1 | 8 | 7 |\n| 17 | 1 | 8 | 8 |\n| 35 | 1 | 32 | 2 |\n\nSS58 addresses also support different payload sizes, allowing a flexible range of account identifiers."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 13, "depth": 3, "title": "Checksum Types", "anchor": "checksum-types", "start_char": 24403, "end_char": 24868, "estimated_token_count": 94, "token_estimator": "heuristic-v1", "text": "### Checksum Types\n\nA checksum is applied to validate SS58 addresses. Polkadot SDK uses a Blake2b-512 hash function to calculate the checksum, which is appended to the address before encoding. The checksum length can vary depending on the address format (e.g., 1-byte, 2-byte, or longer), providing varying levels of validation strength.\n\nThe checksum ensures that an address is not modified or corrupted, adding an extra layer of security for account management."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 14, "depth": 3, "title": "Validating Addresses", "anchor": "validating-addresses", "start_char": 24868, "end_char": 29648, "estimated_token_count": 1074, "token_estimator": "heuristic-v1", "text": "### Validating Addresses\n\nSS58 addresses can be validated using the subkey command-line interface or the Polkadot.js API. These tools help ensure an address is correctly formatted and valid for the intended network. The following sections will provide an overview of how validation works with these tools.\n\n#### Using Subkey\n\n[Subkey](https://paritytech.github.io/polkadot-sdk/master/subkey/index.html){target=\\_blank} is a CLI tool provided by Polkadot SDK for generating and managing keys. It can inspect and validate SS58 addresses.\n\nThe `inspect` command gets a public key and an SS58 address from the provided secret URI. The basic syntax for the `subkey inspect` command is:\n\n```bash\nsubkey inspect [flags] [options] uri\n```\n\nFor the `uri` command-line argument, you can specify the secret seed phrase, a hex-encoded private key, or an SS58 address. If the input is a valid address, the `subkey` program displays the corresponding hex-encoded public key, account identifier, and SS58 addresses.\n\nFor example, to inspect the public keys derived from a secret seed phrase, you can run a command similar to the following:\n\n```bash\nsubkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n Secret phrase `caution juice atom organ advance problem want pledge someone senior holiday very` is account:\n Secret seed: 0xc8fa03532fb22ee1f7f6908b9c02b4e72483f0dbd66e4cd456b8f34c6230b849\n Public key (hex): 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n Public key (SS58): 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n Account ID: 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n SS58 Address: 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n
\n\nThe `subkey` program assumes an address is based on a public/private key pair. If you inspect an address, the command returns the 32-byte account identifier.\n\nHowever, not all addresses in Polkadot SDK-based networks are based on keys.\n\nDepending on the command-line options you specify and the input you provided, the command output might also display the network for which the address has been encoded. For example:\n\n```bash\nsubkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n Public Key URI `12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU` is account:\n Network ID/Version: polkadot\n Public key (hex): 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Account ID: 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Public key (SS58): 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n SS58 Address: 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n
\n\n#### Using Polkadot.js API\n\nTo verify an address in JavaScript or TypeScript projects, you can use the functions built into the [Polkadot.js API](https://polkadot.js.org/docs/){target=\\_blank}. For example:\n\n```js\n// Import Polkadot.js API dependencies\nconst { decodeAddress, encodeAddress } = require('@polkadot/keyring');\nconst { hexToU8a, isHex } = require('@polkadot/util');\n\n// Specify an address to test.\nconst address = 'INSERT_ADDRESS_TO_TEST';\n\n// Check address\nconst isValidSubstrateAddress = () => {\n try {\n encodeAddress(isHex(address) ? hexToU8a(address) : decodeAddress(address));\n\n return true;\n } catch (error) {\n return false;\n }\n};\n\n// Query result\nconst isValid = isValidSubstrateAddress();\nconsole.log(isValid);\n\n```\n\nIf the function returns `true`, the specified address is a valid address.\n\n#### Other SS58 Implementations\n\nSupport for encoding and decoding Polkadot SDK SS58 addresses has been implemented in several other languages and libraries.\n\n- **Crystal**: [`wyhaines/base58.cr`](https://github.com/wyhaines/base58.cr){target=\\_blank}\n- **Go**: [`itering/subscan-plugin`](https://github.com/itering/subscan-plugin){target=\\_blank}\n- **Python**: [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\\_blank}\n- **TypeScript**: [`subsquid/squid-sdk`](https://github.com/subsquid/squid-sdk){target=\\_blank}"} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 2, "depth": 3, "title": "Account", "anchor": "account", "start_char": 862, "end_char": 2898, "estimated_token_count": 501, "token_estimator": "heuristic-v1", "text": "### Account\n\nThe [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Account.html){target=\\_blank} is a storage map within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank} that links an account ID to its corresponding data. This structure is fundamental for mapping account-related information within the chain.\n\nThe code snippet below shows how accounts are defined:\n\n```rs\n \n```\n\nThe preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`).\n\nThe `StorageMap` consists of the following parameters:\n\n- **`_`**: Used in macro expansion and acts as a placeholder for the storage prefix type. Tells the macro to insert the default prefix during expansion.\n- **`Blake2_128Concat`**: The hashing function applied to keys in the storage map.\n- **`T: :AccountId`**: Represents the key type, which corresponds to the account’s unique ID.\n- **`AccountInfo`**: The value type stored in the map. 
For each account ID, the map stores an `AccountInfo` struct containing:\n\n - **`T::Nonce`**: A nonce for the account, which is incremented with each transaction to ensure transaction uniqueness.\n - **`T: :AccountData`**: Custom account data defined by the runtime configuration, which could include balances, locked funds, or other relevant information.\n \n- **`ValueQuery`**: Defines how queries to the storage map behave when no value is found; returns a default value instead of `None`.\n\nFor a detailed explanation of storage maps, see the [`StorageMap`](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/struct.StorageMap.html){target=\\_blank} entry in the Rust docs."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 3, "depth": 3, "title": "Account Info", "anchor": "account-info", "start_char": 2898, "end_char": 4651, "estimated_token_count": 407, "token_estimator": "heuristic-v1", "text": "### Account Info\n\nThe `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules.\n\n```rs\n\n```\n\nThe `AccountInfo` structure includes the following components:\n\n- **`nonce`**: Tracks the number of transactions initiated by the account, which ensures transaction uniqueness and prevents replay attacks.\n- **`consumers`**: Counts how many other modules or pallets rely on this account’s existence. The account cannot be removed from the chain (reaped) until this count reaches zero.\n- **`providers`**: Tracks how many modules permit this account’s existence. An account can only be reaped once both `providers` and `sufficients` are zero.\n- **`sufficients`**: Represents the number of modules that allow the account to exist for internal purposes, independent of any other modules.\n- **`AccountData`**: A flexible data structure that can be customized in the runtime configuration, usually containing balances or other user-specific data.\n\nThis structure helps manage an account's state and prevents its premature removal while it is still referenced by other on-chain data or modules. The [`AccountInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.AccountInfo.html){target=\\_blank} structure can vary as long as it satisfies the trait bounds defined by the `AccountData` associated type in the [`frame-system::pallet::Config`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html){target=\\_blank} trait."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 4, "depth": 3, "title": "Account Reference Counters", "anchor": "account-reference-counters", "start_char": 4651, "end_char": 9744, "estimated_token_count": 1040, "token_estimator": "heuristic-v1", "text": "### Account Reference Counters\n\nPolkadot SDK uses reference counters to track an account’s dependencies across different runtime modules. 
These counters ensure that accounts remain active while data is associated with them.\n\nThe reference counters include:\n\n- **`consumers`**: Prevents account removal while other pallets still rely on the account.\n- **`providers`**: Ensures an account is active before other pallets store data related to it.\n- **`sufficients`**: Indicates the account’s independence, ensuring it can exist even without a native token balance, such as when holding sufficient alternative assets.\n\n#### Providers Reference Counters\n\nThe `providers` counter ensures that an account is ready to be depended upon by other runtime modules. For example, it is incremented when an account has a balance above the existential deposit, which marks the account as active.\n\nThe system requires this reference counter to be greater than zero for the `consumers` counter to be incremented, ensuring the account is stable before any dependencies are added.\n\n#### Consumers Reference Counters\n\nThe `consumers` counter ensures that the account cannot be reaped until all references to it across the runtime have been removed. This check prevents the accidental deletion of accounts that still have active on-chain data.\n\nIt is the user’s responsibility to clear out any data from other runtime modules if they wish to remove their account and reclaim their existential deposit.\n\n#### Sufficients Reference Counter\n\nThe `sufficients` counter tracks accounts that can exist independently without relying on a native account balance. This is useful for accounts holding other types of assets, like tokens, without needing a minimum balance in the native token.\n\nFor instance, the [Assets pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_assets/index.html){target=\\_blank}, may increment this counter for an account holding sufficient tokens.\n\n#### Account Deactivation\n\nIn Polkadot SDK-based chains, an account is deactivated when its reference counters (such as `providers`, `consumers`, and `sufficient`) reach zero. These counters ensure the account remains active as long as other runtime modules or pallets reference it.\n\nWhen all dependencies are cleared and the counters drop to zero, the account becomes deactivated and may be removed from the chain (reaped). This is particularly important in Polkadot SDK-based blockchains, where accounts with balances below the existential deposit threshold are pruned from storage to conserve state resources.\n\nEach pallet that references an account has cleanup functions that decrement these counters when the pallet no longer depends on the account. Once these counters reach zero, the account is marked for deactivation.\n\n#### Updating Counters\n\nThe Polkadot SDK provides runtime developers with various methods to manage account lifecycle events, such as deactivation or incrementing reference counters. 
These methods ensure that accounts cannot be reaped while still in use.\n\nThe following helper functions manage these counters:\n\n- **`inc_consumers()`**: Increments the `consumer` reference counter for an account, signaling that another pallet depends on it.\n- **`dec_consumers()`**: Decrements the `consumer` reference counter, signaling that a pallet no longer relies on the account.\n- **`inc_providers()`**: Increments the `provider` reference counter, ensuring the account remains active.\n- **`dec_providers()`**: Decrements the `provider` reference counter, allowing for account deactivation when no longer in use.\n- **`inc_sufficients()`**: Increments the `sufficient` reference counter for accounts that hold sufficient assets.\n- **`dec_sufficients()`**: Decrements the `sufficient` reference counter.\n\nTo ensure proper account cleanup and lifecycle management, a corresponding decrement should be made for each increment action.\n\nThe `System` pallet offers three query functions to assist developers in tracking account states:\n\n- **[`can_inc_consumer()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_inc_consumer){target=\\_blank}**: Checks if the account can safely increment the consumer reference.\n- **[`can_dec_provider()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_dec_provider){target=\\_blank}**: Ensures that no consumers exist before allowing the decrement of the provider counter.\n- **[`is_provider_required()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.is_provider_required){target=\\_blank}**: Verifies whether the account still has any active consumer references.\n\nThis modular and flexible system of reference counters tightly controls the lifecycle of accounts in Polkadot SDK-based blockchains, preventing the accidental removal or retention of unneeded accounts. You can refer to the [System pallet Rust docs](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html){target=\\_blank} for more details."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 5, "depth": 2, "title": "Account Balance Types", "anchor": "account-balance-types", "start_char": 9744, "end_char": 11664, "estimated_token_count": 465, "token_estimator": "heuristic-v1", "text": "## Account Balance Types\n\nIn the Polkadot ecosystem, account balances are categorized into different types based on how the funds are utilized and their availability. These balance types determine the actions that can be performed, such as transferring tokens, paying transaction fees, or participating in governance activities. Understanding these balance types helps developers manage user accounts and implement balance-dependent logic.\n\n!!! note \"A more efficient distribution of account balance types is in development\"\n Soon, pallets in the Polkadot SDK will implement the [`Fungible` trait](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} (see the [tracking issue](https://github.com/paritytech/polkadot-sdk/issues/226){target=\\_blank} for more details). 
For example, the [`transaction-storage`](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_storage/index.html){target=\\_blank} pallet changed the implementation of the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/currency/index.html){target=\\_blank} trait (see the [Refactor transaction storage pallet to use fungible traits](https://github.com/paritytech/polkadot-sdk/pull/1800){target=\\_blank} PR for further details):\n\n ```rust\n type BalanceOf = <::Currency as Currency<::AccountId>>::Balance;\n ```\n \n To the [`Fungible`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} trait:\n\n ```rust\n type BalanceOf = <::Currency as FnInspect<::AccountId>>::Balance;\n ```\n \n This update will enable more efficient use of account balances, allowing the free balance to be utilized for on-chain activities such as setting proxies and managing identities."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 6, "depth": 3, "title": "Balance Types", "anchor": "balance-types", "start_char": 11664, "end_char": 14139, "estimated_token_count": 601, "token_estimator": "heuristic-v1", "text": "### Balance Types\n\nThe five main balance types are:\n\n- **Free balance**: Represents the total tokens available to the account for any on-chain activity, including staking, governance, and voting. However, it may not be fully spendable or transferrable if portions of it are locked or reserved.\n- **Locked balance**: Portions of the free balance that cannot be spent or transferred because they are tied up in specific activities like [staking](https://wiki.polkadot.com/learn/learn-staking/#nominating-validators){target=\\_blank}, [vesting](https://wiki.polkadot.com/learn/learn-guides-transfers/#vested-transfers-with-the-polkadot-js-ui){target=\\_blank}, or participating in [governance](https://wiki.polkadot.com/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\\_blank}. While the tokens remain part of the free balance, they are non-transferable for the duration of the lock.\n- **Reserved balance**: Funds locked by specific system actions, such as setting up an [identity](https://wiki.polkadot.com/learn/learn-identity/){target=\\_blank}, creating [proxies](https://wiki.polkadot.com/learn/learn-proxies/){target=\\_blank}, or submitting [deposits for governance proposals](https://wiki.polkadot.com/learn/learn-guides-polkadot-opengov/#claiming-opengov-deposits){target=\\_blank}. These tokens are not part of the free balance and cannot be spent unless they are unreserved.\n- **Spendable balance**: The portion of the free balance that is available for immediate spending or transfers. It is calculated by subtracting the maximum of locked or reserved amounts from the free balance, ensuring that existential deposit limits are met.\n- **Untouchable balance**: Funds that cannot be directly spent or transferred but may still be utilized for on-chain activities, such as governance participation or staking. These tokens are typically tied to certain actions or locked for a specific period.\n\nThe spendable balance is calculated as follows:\n\n```text\nspendable = free - max(locked - reserved, ED)\n```\n\nHere, `free`, `locked`, and `reserved` are defined above. 
The `ED` represents the [existential deposit](https://wiki.polkadot.com/learn/learn-accounts/#existential-deposit-and-reaping){target=\\_blank}, the minimum balance required to keep an account active and prevent it from being reaped. You may find you can't see all balance types when looking at your account via a wallet. Wallet providers often display only spendable, locked, and reserved balances."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 7, "depth": 3, "title": "Locks", "anchor": "locks", "start_char": 14139, "end_char": 16334, "estimated_token_count": 464, "token_estimator": "heuristic-v1", "text": "### Locks\n\nLocks are applied to an account's free balance, preventing that portion from being spent or transferred. Locks are automatically placed when an account participates in specific on-chain activities, such as staking or governance. Although multiple locks may be applied simultaneously, they do not stack. Instead, the largest lock determines the total amount of locked tokens.\n\nLocks follow these basic rules:\n\n- If different locks apply to varying amounts, the largest lock amount takes precedence.\n- If multiple locks apply to the same amount, the lock with the longest duration governs when the balance can be unlocked.\n\n#### Locks Example\n\nConsider an example where an account has 80 DOT locked for both staking and governance purposes like so:\n\n- 80 DOT is staked with a 28-day lock period.\n- 24 DOT is locked for governance with a 1x conviction and a 7-day lock period.\n- 4 DOT is locked for governance with a 6x conviction and a 224-day lock period.\n\nIn this case, the total locked amount is 80 DOT because only the largest lock (80 DOT from staking) governs the locked balance. These 80 DOT will be released at different times based on the lock durations. In this example, the 24 DOT locked for governance will be released first since the shortest lock period is seven days. The 80 DOT stake with a 28-day lock period is released next. Now, all that remains locked is the 4 DOT for governance. After 224 days, all 80 DOT (minus the existential deposit) will be free and transferable.\n\n![Illustration of Lock Example](/images/polkadot-protocol/parachain-basics/accounts/locks-example-2.webp)\n\n#### Edge Cases for Locks\n\nIn scenarios where multiple convictions and lock periods are active, the lock duration and amount are determined by the longest period and largest amount. For example, if you delegate with different convictions and attempt to undelegate during an active lock period, the lock may be extended for the full amount of tokens. For a detailed discussion on edge case lock behavior, see this [Stack Exchange post](https://substrate.stackexchange.com/questions/5067/delegating-and-undelegating-during-the-lock-period-extends-it-for-the-initial-am){target=\\_blank}."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 8, "depth": 3, "title": "Balance Types on Polkadot.js", "anchor": "balance-types-on-polkadotjs", "start_char": 16334, "end_char": 19399, "estimated_token_count": 611, "token_estimator": "heuristic-v1", "text": "### Balance Types on Polkadot.js\n\nPolkadot.js provides a user-friendly interface for managing and visualizing various account balances on Polkadot and Kusama networks. When interacting with Polkadot.js, you will encounter multiple balance types that are critical for understanding how your funds are distributed and restricted. 
This section explains how different balances are displayed in the Polkadot.js UI and what each type represents.\n\n![](/images/polkadot-protocol/parachain-basics/accounts/account-balance-types-1.webp)\n\nThe most common balance types displayed on Polkadot.js are:\n\n- **Total balance**: The total number of tokens available in the account. This includes all tokens, whether they are transferable, locked, reserved, or vested. However, the total balance does not always reflect what can be spent immediately. In this example, the total balance is 0.6274 KSM.\n\n- **Transferable balance**: Shows how many tokens are immediately available for transfer. It is calculated by subtracting the locked and reserved balances from the total balance. For example, if an account has a total balance of 0.6274 KSM and a transferable balance of 0.0106 KSM, only the latter amount can be sent or spent freely.\n\n- **Vested balance**: Tokens that allocated to the account but released according to a specific schedule. Vested tokens remain locked and cannot be transferred until fully vested. For example, an account with a vested balance of 0.2500 KSM means that this amount is owned but not yet transferable.\n\n- **Locked balance**: Tokens that are temporarily restricted from being transferred or spent. These locks typically result from participating in staking, governance, or vested transfers. In Polkadot.js, locked balances do not stack—only the largest lock is applied. For instance, if an account has 0.5500 KSM locked for governance and staking, the locked balance would display 0.5500 KSM, not the sum of all locked amounts.\n\n- **Reserved balance**: Refers to tokens locked for specific on-chain actions, such as setting an identity, creating a proxy, or making governance deposits. Reserved tokens are not part of the free balance, but can be freed by performing certain actions. For example, removing an identity would unreserve those funds.\n\n- **Bonded balance**: The tokens locked for staking purposes. Bonded tokens are not transferable until they are unbonded after the unbonding period.\n\n- **Redeemable balance**: The number of tokens that have completed the unbonding period and are ready to be unlocked and transferred again. For example, if an account has a redeemable balance of 0.1000 KSM, those tokens are now available for spending.\n\n- **Democracy balance**: Reflects the number of tokens locked for governance activities, such as voting on referenda. These tokens are locked for the duration of the governance action and are only released after the lock period ends.\n\nBy understanding these balance types and their implications, developers and users can better manage their funds and engage with on-chain activities more effectively."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 9, "depth": 2, "title": "Address Formats", "anchor": "address-formats", "start_char": 19399, "end_char": 19858, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Address Formats\n\nThe SS58 address format is a core component of the Polkadot SDK that enables accounts to be uniquely identified across Polkadot-based networks. This format is a modified version of Bitcoin's Base58Check encoding, specifically designed to accommodate the multi-chain nature of the Polkadot ecosystem. 
SS58 encoding allows each chain to define its own set of addresses while maintaining compatibility and checksum validation for security."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 10, "depth": 3, "title": "Basic Format", "anchor": "basic-format", "start_char": 19858, "end_char": 21100, "estimated_token_count": 295, "token_estimator": "heuristic-v1", "text": "### Basic Format\n\nSS58 addresses consist of three main components:\n\n```text\nbase58encode(concat(,
, ))\n```\n\n- **Address type**: A byte or set of bytes that define the network (or chain) for which the address is intended. This ensures that addresses are unique across different Polkadot SDK-based chains.\n- **Address**: The public key of the account encoded as bytes.\n- **Checksum**: A hash-based checksum which ensures that addresses are valid and unaltered. The checksum is derived from the concatenated address type and address components, ensuring integrity.\n\nThe encoding process transforms the concatenated components into a Base58 string, providing a compact and human-readable format that avoids easily confused characters (e.g., zero '0', capital 'O', lowercase 'l'). This encoding function ([`encode`](https://docs.rs/bs58/latest/bs58/fn.encode.html){target=\\_blank}) is implemented exactly as defined in Bitcoin and IPFS specifications, using the same alphabet as both implementations.\n\nFor more details about the SS58 address format implementation, see the [`Ss58Codec`](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Ss58Codec.html){target=\\_blank} trait in the Rust Docs."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 11, "depth": 3, "title": "Address Type", "anchor": "address-type", "start_char": 21100, "end_char": 22035, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "### Address Type\n\nThe address type defines how an address is interpreted and to which network it belongs. Polkadot SDK uses different prefixes to distinguish between various chains and address formats:\n\n- **Address types `0-63`**: Simple addresses, commonly used for network identifiers.\n- **Address types `64-127`**: Full addresses that support a wider range of network identifiers.\n- **Address types `128-255`**: Reserved for future address format extensions.\n\nFor example, Polkadot’s main network uses an address type of 0, while Kusama uses 2. This ensures that addresses can be used without confusion between networks.\n\nThe address type is always encoded as part of the SS58 address, making it easy to quickly identify the network. Refer to the [SS58 registry](https://github.com/paritytech/ss58-registry){target=\\_blank} for the canonical listing of all address type identifiers and how they map to Polkadot SDK-based networks."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 12, "depth": 3, "title": "Address Length", "anchor": "address-length", "start_char": 22035, "end_char": 23229, "estimated_token_count": 268, "token_estimator": "heuristic-v1", "text": "### Address Length\n\nSS58 addresses can have different lengths depending on the specific format. Address lengths range from as short as 3 to 35 bytes, depending on the complexity of the address and network requirements. 
This flexibility allows SS58 addresses to adapt to different chains while providing a secure encoding mechanism.\n\n| Total | Type | Raw account | Checksum |\n|-------|------|-------------|----------|\n| 3 | 1 | 1 | 1 |\n| 4 | 1 | 2 | 1 |\n| 5 | 1 | 2 | 2 |\n| 6 | 1 | 4 | 1 |\n| 7 | 1 | 4 | 2 |\n| 8 | 1 | 4 | 3 |\n| 9 | 1 | 4 | 4 |\n| 10 | 1 | 8 | 1 |\n| 11 | 1 | 8 | 2 |\n| 12 | 1 | 8 | 3 |\n| 13 | 1 | 8 | 4 |\n| 14 | 1 | 8 | 5 |\n| 15 | 1 | 8 | 6 |\n| 16 | 1 | 8 | 7 |\n| 17 | 1 | 8 | 8 |\n| 35 | 1 | 32 | 2 |\n\nSS58 addresses also support different payload sizes, allowing a flexible range of account identifiers."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 13, "depth": 3, "title": "Checksum Types", "anchor": "checksum-types", "start_char": 23229, "end_char": 23694, "estimated_token_count": 94, "token_estimator": "heuristic-v1", "text": "### Checksum Types\n\nA checksum is applied to validate SS58 addresses. Polkadot SDK uses a Blake2b-512 hash function to calculate the checksum, which is appended to the address before encoding. The checksum length can vary depending on the address format (e.g., 1-byte, 2-byte, or longer), providing varying levels of validation strength.\n\nThe checksum ensures that an address is not modified or corrupted, adding an extra layer of security for account management."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 14, "depth": 3, "title": "Validating Addresses", "anchor": "validating-addresses", "start_char": 23694, "end_char": 28474, "estimated_token_count": 1074, "token_estimator": "heuristic-v1", "text": "### Validating Addresses\n\nSS58 addresses can be validated using the subkey command-line interface or the Polkadot.js API. These tools help ensure an address is correctly formatted and valid for the intended network. The following sections will provide an overview of how validation works with these tools.\n\n#### Using Subkey\n\n[Subkey](https://paritytech.github.io/polkadot-sdk/master/subkey/index.html){target=\\_blank} is a CLI tool provided by Polkadot SDK for generating and managing keys. It can inspect and validate SS58 addresses.\n\nThe `inspect` command gets a public key and an SS58 address from the provided secret URI. The basic syntax for the `subkey inspect` command is:\n\n```bash\nsubkey inspect [flags] [options] uri\n```\n\nFor the `uri` command-line argument, you can specify the secret seed phrase, a hex-encoded private key, or an SS58 address. If the input is a valid address, the `subkey` program displays the corresponding hex-encoded public key, account identifier, and SS58 addresses.\n\nFor example, to inspect the public keys derived from a secret seed phrase, you can run a command similar to the following:\n\n```bash\nsubkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n Secret phrase `caution juice atom organ advance problem want pledge someone senior holiday very` is account:\n Secret seed: 0xc8fa03532fb22ee1f7f6908b9c02b4e72483f0dbd66e4cd456b8f34c6230b849\n Public key (hex): 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n Public key (SS58): 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n Account ID: 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n SS58 Address: 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n
\n\nThe `subkey` program assumes an address is based on a public/private key pair. If you inspect an address, the command returns the 32-byte account identifier.\n\nHowever, not all addresses in Polkadot SDK-based networks are based on keys.\n\nDepending on the command-line options you specify and the input you provided, the command output might also display the network for which the address has been encoded. For example:\n\n```bash\nsubkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n Public Key URI `12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU` is account:\n Network ID/Version: polkadot\n Public key (hex): 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Account ID: 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Public key (SS58): 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n SS58 Address: 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n
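A similar network-aware check can be done programmatically: decoding an SS58 address yields the raw 32-byte public key, which can then be re-encoded under another network prefix (0 for Polkadot and 2 for Kusama, as described in the Address Type section above; 42 is the generic Substrate prefix per the SS58 registry). The sketch below is an assumption-based example using `@polkadot/util-crypto` rather than part of the `subkey` tooling:

```typescript
// Decode an SS58 address into its raw 32-byte public key, then re-encode it
// under different network prefixes (0 = Polkadot, 2 = Kusama, 42 = generic
// Substrate). The underlying account identifier itself never changes.
import { decodeAddress, encodeAddress } from '@polkadot/util-crypto';

const polkadotAddress = '12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU';
const publicKey = decodeAddress(polkadotAddress);

console.log('Polkadot (0): ', encodeAddress(publicKey, 0));
console.log('Kusama   (2): ', encodeAddress(publicKey, 2));
console.log('Generic  (42):', encodeAddress(publicKey, 42));
```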
\n\n#### Using Polkadot.js API\n\nTo verify an address in JavaScript or TypeScript projects, you can use the functions built into the [Polkadot.js API](https://polkadot.js.org/docs/){target=\\_blank}. For example:\n\n```js\n// Import Polkadot.js API dependencies\nconst { decodeAddress, encodeAddress } = require('@polkadot/keyring');\nconst { hexToU8a, isHex } = require('@polkadot/util');\n\n// Specify an address to test.\nconst address = 'INSERT_ADDRESS_TO_TEST';\n\n// Check address\nconst isValidSubstrateAddress = () => {\n try {\n encodeAddress(isHex(address) ? hexToU8a(address) : decodeAddress(address));\n\n return true;\n } catch (error) {\n return false;\n }\n};\n\n// Query result\nconst isValid = isValidSubstrateAddress();\nconsole.log(isValid);\n\n```\n\nIf the function returns `true`, the specified address is a valid address.\n\n#### Other SS58 Implementations\n\nSupport for encoding and decoding Polkadot SDK SS58 addresses has been implemented in several other languages and libraries.\n\n- **Crystal**: [`wyhaines/base58.cr`](https://github.com/wyhaines/base58.cr){target=\\_blank}\n- **Go**: [`itering/subscan-plugin`](https://github.com/itering/subscan-plugin){target=\\_blank}\n- **Python**: [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\\_blank}\n- **TypeScript**: [`subsquid/squid-sdk`](https://github.com/subsquid/squid-sdk){target=\\_blank}"} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 10, "end_char": 707, "estimated_token_count": 130, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn the Polkadot SDK, blocks are fundamental to the functioning of the blockchain, serving as containers for [transactions](/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/){target=\\_blank} and changes to the chain's state. Blocks consist of headers and an array of transactions, ensuring the integrity and validity of operations on the network. This guide explores the essential components of a block, the process of block production, and how blocks are validated and imported across the network. By understanding these concepts, developers can better grasp how blockchains maintain security, consistency, and performance within the Polkadot ecosystem."} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 1, "depth": 2, "title": "What is a Block?", "anchor": "what-is-a-block", "start_char": 707, "end_char": 1844, "estimated_token_count": 226, "token_estimator": "heuristic-v1", "text": "## What is a Block?\n\nIn the Polkadot SDK, a block is a fundamental unit that encapsulates both the header and an array of transactions. The block header includes critical metadata to ensure the integrity and sequence of the blockchain. Here's a breakdown of its components:\n\n- **Block height**: Indicates the number of blocks created in the chain so far.\n- **Parent hash**: The hash of the previous block, providing a link to maintain the blockchain's immutability.\n- **Transaction root**: Cryptographic digest summarizing all transactions in the block.\n- **State root**: A cryptographic digest representing the post-execution state.\n- **Digest**: Additional information that can be attached to a block, such as consensus-related messages.\n\nEach transaction is part of a series that is executed according to the runtime's rules. 
The transaction root is a cryptographic digest of this series, which prevents alterations and enables succinct verification by light clients. This verification process allows light clients to confirm whether a transaction exists in a block with only the block header, avoiding downloading the entire block."} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 2, "depth": 2, "title": "Block Production", "anchor": "block-production", "start_char": 1844, "end_char": 2168, "estimated_token_count": 57, "token_estimator": "heuristic-v1", "text": "## Block Production\n\nWhen an authoring node is authorized to create a new block, it selects transactions from the transaction queue based on priority. This step, known as block production, relies heavily on the executive module to manage the initialization and finalization of blocks. The process is summarized as follows:"} @@ -1239,7 +1239,7 @@ {"page_id": "tutorials-dapps-remark-tutorial", "page_title": "PAPI Account Watcher Tutorial", "index": 7, "depth": 2, "title": "Test the CLI", "anchor": "test-the-cli", "start_char": 6108, "end_char": 7721, "estimated_token_count": 521, "token_estimator": "heuristic-v1", "text": "## Test the CLI\n\nTo test the application, navigate to the [**Extrinsics** page of the PAPI Dev Console](https://dev.papi.how/extrinsics#networkId=westend&endpoint=light-client){target=\\_blank}. Select the **System** pallet and the **remark_with_event** call. Ensure the input field follows the convention `address+email`. For example, if monitoring `5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY`, the input should be:\n\n![](/images/tutorials/dapps/remark-tutorial/papi-console.webp)\n\nSubmit the extrinsic and sign it using the Polkadot.js browser wallet. The CLI will display the following output and play the \"You've Got Mail!\" sound:\n\n
\n npm start -- --account 5GrwvaEF5zXb26Fz9rcQpDWS57CtERHpNehXCPcNoHGKutQY\n __ __ _ _____ __ __ _ _ __ __ _ _\n \\ \\ / /__| |__|___ / | \\/ | __ _(_) | \\ \\ / /_ _| |_ ___| |__ ___ _ __\n \\ \\ /\\ / / _ \\ '_ \\ |_ \\ | |\\/| |/ _` | | | \\ \\ /\\ / / _` | __/ __| '_ \\ / _ \\ '__|\n \\ V V / __/ |_) |__) | | | | | (_| | | | \\ V V / (_| | || (__| | | | __/ |\n \\_/\\_/ \\___|_.__/____/ |_| |_|\\__,_|_|_| \\_/\\_/ \\__,_|\\__\\___|_| |_|\\___|_|\n \n 📬 Watching account: 5Cm8yiG45rqrpyV2zPLrbtr8efksrRuCXcqcB4xj8AejfcTB\n 📥 You've got mail!\n 👤 From: 5Cm8yiG45rqrpyV2zPLrbtr8efksrRuCXcqcB4xj8AejfcTB\n 🔖 Hash: 0xb6999c9082f5b1dede08b387404c9eb4eb2deee4781415dfa7edf08b87472050\n
"} {"page_id": "tutorials-dapps-remark-tutorial", "page_title": "PAPI Account Watcher Tutorial", "index": 8, "depth": 2, "title": "Next Steps", "anchor": "next-steps", "start_char": 7721, "end_char": 8055, "estimated_token_count": 69, "token_estimator": "heuristic-v1", "text": "## Next Steps\n\nThis application demonstrates how the Polkadot API can be used to build decentralized applications. While this is not a production-grade application, it introduces several key features for developing with the Polkadot API.\n\nTo explore more, refer to the [official PAPI documentation](https://papi.how){target=\\_blank}."} {"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 491, "end_char": 541, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 541, "end_char": 1220, "estimated_token_count": 190, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-dapps", "page_title": "Decentralized Application Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 541, "end_char": 1198, "estimated_token_count": 184, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 44, "end_char": 735, "estimated_token_count": 150, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn this tutorial, you'll learn how to replay and dry-run XCMs using [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/){target=\\_blank}, a powerful tool for forking live Polkadot SDK-based chains in your local environment. 
These techniques are essential for:\n\n- Debugging cross-chain message failures.\n- Tracing execution across relay chains and parachains.\n- Analyzing weight usage, error types, and message flow.\n- Safely simulating XCMs without committing state changes.\n\nBy the end of this guide, you'll be able to set up a local fork, capture and replay real XCMs, and use dry-run features to diagnose and resolve complex cross-chain issues."} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 735, "end_char": 1478, "estimated_token_count": 199, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, make sure you have:\n\n- [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/){target=\\_blank} installed (`npm i -g @acala-network/chopsticks`).\n- Access to the endpoint or genesis file of the parachain you want to fork.\n- The block number or hash where the XCM was sent.\n- (Optional) A Chopsticks config file for repeated setups.\n\nIf you haven't forked a chain before, see the [Fork a Chain with Chopsticks guide](/tutorials/polkadot-sdk/testing/fork-live-chains/){target=\\_blank} or [Fork a Network Locally using Chopsticks](https://wiki.polkadot.com/learn/learn-guides-test-opengov-proposals/#fork-a-network-locally-using-chopsticks){target=\\_blank} for step-by-step instructions."} {"page_id": "tutorials-interoperability-replay-and-dry-run-xcms", "page_title": "Replay and Dry Run XCMs", "index": 2, "depth": 2, "title": "Set Up Your Project", "anchor": "set-up-your-project", "start_char": 1478, "end_char": 2310, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Set Up Your Project\n\nLet's start by creating a dedicated workspace for your XCM replay and dry-run experiments.\n\n1. Create a new directory and navigate into it:\n\n ```bash\n mkdir -p replay-xcm-tests\n cd replay-xcm-tests\n ```\n\n2. Initialize a new Node project:\n\n ```bash\n npm init -y\n ```\n\n3. Install Chopsticks globally (recommended to avoid conflicts with local installs):\n\n ```bash\n npm install -g @acala-network/chopsticks@latest\n ```\n\n4. Install TypeScript and related tooling for local development:\n\n ```bash\n npm install --save-dev typescript @types/node tsx\n ```\n\n5. Install the required Polkadot packages:\n\n ```bash\n npm install polkadot-api @polkadot-labs/hdkd @polkadot-labs/hdkd-helpers\n ```\n\n6. Initialize the TypeScript config:\n\n ```bash\n npx tsc --init\n ```"} @@ -1276,7 +1276,7 @@ {"page_id": "tutorials-interoperability-xcm-channels-para-to-system", "page_title": "Opening HRMP Channels with System Parachains", "index": 5, "depth": 3, "title": "Craft and Submit the XCM Message", "anchor": "craft-and-submit-the-xcm-message", "start_char": 3780, "end_char": 7208, "estimated_token_count": 685, "token_estimator": "heuristic-v1", "text": "### Craft and Submit the XCM Message\n\nConnect to parachain 2500 using Polkadot.js Apps to send the XCM message to the relay chain. Input the necessary parameters as illustrated in the image below. Make sure to:\n\n1. Insert your previously encoded `establish_channel_with_system` call data into the **`call`** field.\n2. Provide beneficiary details.\n3. Dispatch the XCM message to the relay chain by clicking the **Submit Transaction** button.\n\n![](/images/tutorials/interoperability/xcm-channels/para-to-system/hrmp-para-to-system-2.webp)\n\n!!! 
note\n The exact process and parameters for submitting this XCM message may vary depending on your specific parachain and relay chain configurations. Always refer to the most current documentation for your particular network setup.\n\nAfter successfully submitting the XCM message to the relay chain, two [`HrmpSystemChannelOpened`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_parachains/hrmp/pallet/enum.Event.html#variant.HrmpSystemChannelOpened){target=\\_blank} events are emitted, indicating that the channels are now present in storage under [`HrmpOpenChannelRequests`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_parachains/hrmp/pallet/storage_types/struct.HrmpOpenChannelRequests.html){target=\\_blank}. However, the channels are not actually set up until the start of the next session, at which point bidirectional communication between parachain 2500 and system chain 1000 is established.\n\nTo verify this, wait for the next session and then follow these steps:\n\n1. Using Polkadot.js Apps, connect to the relay chain and navigate to the **Developer** dropdown, then select **Chain state**.\n\n ![](/images/tutorials/interoperability/xcm-channels/hrmp-channels-1.webp)\n\n2. Query the HRMP channels:\n\n 1. Select **`hrmp`** from the options.\n 2. Choose the **`hrmpChannels`** call.\n 3. Click the **+** button to execute the query.\n\n ![](/images/tutorials/interoperability/xcm-channels/para-to-system/hrmp-para-to-system-3.webp)\n \n3. Examine the query results. You should see output similar to the following:\n\n ```json\n [\n [\n [\n {\n \"sender\": 1000,\n \"recipient\": 2500\n }\n ],\n {\n \"maxCapacity\": 8,\n \"maxTotalSize\": 8192,\n \"maxMessageSize\": 1048576,\n \"msgCount\": 0,\n \"totalSize\": 0,\n \"mqcHead\": null,\n \"senderDeposit\": 0,\n \"recipientDeposit\": 0\n }\n ],\n [\n [\n {\n \"sender\": 2500,\n \"recipient\": 1000\n }\n ],\n {\n \"maxCapacity\": 8,\n \"maxTotalSize\": 8192,\n \"maxMessageSize\": 1048576,\n \"msgCount\": 0,\n \"totalSize\": 0,\n \"mqcHead\": null,\n \"senderDeposit\": 0,\n \"recipientDeposit\": 0\n }\n ]\n ]\n\n ```\n\nThe output confirms the successful establishment of two HRMP channels:\n\n- From chain 1000 (system chain) to chain 2500 (parachain).\n- From chain 2500 (parachain) to chain 1000 (system chain).\n\nThis bidirectional channel enables direct communication between the system chain and the parachain, allowing for cross-chain message passing."} {"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 0, "depth": 2, "title": "Understand the Process of Opening Channels", "anchor": "understand-the-process-of-opening-channels", "start_char": 787, "end_char": 1357, "estimated_token_count": 95, "token_estimator": "heuristic-v1", "text": "## Understand the Process of Opening Channels\n\nEach parachain starts with two default unidirectional XCM channels: an upward channel for sending messages to the relay chain, and a downward channel for receiving messages. These channels are implicitly available.\n\nTo enable communication between parachains, explicit HRMP channels must be established by registering them on the relay chain. This process requires a deposit to cover the costs associated with storing message queues on the relay chain. 
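These deposit values are part of the relay chain's active host configuration and can be inspected programmatically. The following is a minimal sketch using the Polkadot.js API; the `configuration.activeConfig` storage item and the `hrmpSenderDeposit`/`hrmpRecipientDeposit` field names are assumptions to verify against your target relay chain's metadata:

```typescript
// Sketch: read the relay chain's active host configuration, which contains
// the HRMP parameters (look for fields such as `hrmpSenderDeposit` and
// `hrmpRecipientDeposit` in the printed output).
import '@polkadot/api-augment/polkadot';
import { ApiPromise, WsProvider } from '@polkadot/api';

async function printHrmpConfig(): Promise<void> {
  const api = await ApiPromise.create({
    provider: new WsProvider('INSERT_RELAY_CHAIN_ENDPOINT'),
  });

  const config = await api.query.configuration.activeConfig();
  console.log(config.toHuman());

  await api.disconnect();
}

printHrmpConfig().catch(console.error);
```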
The deposit amount depends on the specific relay chain’s parameters."} {"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1357, "end_char": 1407, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1407, "end_char": 1808, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-interoperability-xcm-channels", "page_title": "Tutorials for Managing XCM Channels", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1407, "end_char": 1797, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 450, "estimated_token_count": 76, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nWhen sending cross-chain messages, ensure that the transaction will be successful not only in the local chain but also in the destination chain and any intermediate chains.\n\nSending cross-chain messages requires estimating the fees for the operation. \n\nThis tutorial will demonstrate how to dry-run and estimate the fees for teleporting assets from the Paseo Asset Hub parachain to the Paseo Bridge Hub chain."} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 1, "depth": 2, "title": "Fee Mechanism", "anchor": "fee-mechanism", "start_char": 450, "end_char": 1437, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## Fee Mechanism\n\nThere are three types of fees that can be charged when sending a cross-chain message:\n\n- **Local execution fees**: Fees charged in the local chain for executing the message.\n- **Delivery fees**: Fees charged for delivering the message to the destination chain.\n- **Remote execution fees**: Fees charged in the destination chain for executing the message.\n\nIf there are multiple intermediate chains, delivery fees and remote execution fees will be charged for each one.\n\nIn this example, you will estimate the fees for teleporting assets from the Paseo Asset Hub parachain to the Paseo Bridge Hub chain. The fee structure will be as follows:\n\n```mermaid\nflowchart LR\n AssetHub[Paseo Asset Hub] -->|Delivery Fees| BridgeHub[Paseo Bridge Hub]\n AssetHub -->|
Local Execution Fees| AssetHub\n    BridgeHub -->|Remote Execution
Fees| BridgeHub\n```\n\nThe overall fees are `local_execution_fees` + `delivery_fees` + `remote_execution_fees`."} {"page_id": "tutorials-interoperability-xcm-fee-estimation", "page_title": "XCM Fee Estimation", "index": 2, "depth": 2, "title": "Environment Setup", "anchor": "environment-setup", "start_char": 1437, "end_char": 3989, "estimated_token_count": 588, "token_estimator": "heuristic-v1", "text": "## Environment Setup\n\nFirst, you need to set up your environment:\n\n1. Create a new directory and initialize the project:\n\n ```bash\n mkdir xcm-fee-estimation && \\\n cd xcm-fee-estimation\n ```\n\n2. Initialize the project:\n\n ```bash\n npm init -y\n ```\n\n3. Install dev dependencies:\n\n ```bash\n npm install --save-dev @types/node@^22.12.0 ts-node@^10.9.2 typescript@^5.7.3\n ```\n\n4. Install dependencies:\n\n ```bash\n npm install --save @polkadot-labs/hdkd@^0.0.13 @polkadot-labs/hdkd-helpers@^0.0.13 polkadot-api@1.9.5\n ```\n\n5. Create TypeScript configuration:\n\n ```bash\n npx tsc --init\n ```\n\n6. Generate the types for the Polkadot API for Paseo Bridge Hub and Paseo Asset Hub:\n\n ```bash\n npx papi add paseoAssetHub -n paseo_asset_hub && \\\n npx papi add paseoBridgeHub -w wss://bridge-hub-paseo.dotters.network\n ```\n\n7. Create a new file called `teleport-ah-to-bridge-hub.ts`:\n\n ```bash\n touch teleport-ah-to-bridge-hub.ts\n ```\n\n8. Import the necessary modules. Add the following code to the `teleport-ah-to-bridge-hub.ts` file:\n\n ```typescript title=\"teleport-ah-to-bridge-hub.ts\"\n import { paseoAssetHub, paseoBridgeHub } from '@polkadot-api/descriptors';\n import { createClient, FixedSizeBinary, Enum } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/node';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedLocation,\n XcmVersionedAssetId,\n XcmV3Junctions,\n XcmV3MultiassetFungibility,\n XcmVersionedXcm,\n XcmV5Instruction,\n XcmV5Junctions,\n XcmV5Junction,\n XcmV5AssetFilter,\n XcmV5WildAsset,\n } from '@polkadot-api/descriptors';\n ```\n\n9. 
Define constants and a `main` function where you will implement all the logic:\n\n ```typescript title=\"teleport-ah-to-bridge-hub.ts\"\n // 1 PAS = 10^10 units\n const PAS_UNITS = 10_000_000_000n; // 1 PAS\n const PAS_CENTS = 100_000_000n; // 0.01 PAS\n\n // Paseo Asset Hub constants\n const PASEO_ASSET_HUB_RPC_ENDPOINT = 'ws://localhost:8001';\n const ASSET_HUB_ACCOUNT = '15oF4uVJwmo4TdGW7VfQxNLavjCXviqxT9S1MgbjMNHr6Sp5'; // Alice (Paseo Asset Hub)\n\n // Bridge Hub destination\n const BRIDGE_HUB_RPC_ENDPOINT = 'ws://localhost:8000';\n const BRIDGE_HUB_PARA_ID = 1002;\n const BRIDGE_HUB_BENEFICIARY =\n async function main() {\n // Code will go here\n }\n ```\n\nAll the following code explained in the subsequent sections must be added inside the `main` function."} @@ -1299,7 +1299,7 @@ {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 0, "depth": 2, "title": "XCM (Cross-Consensus Messaging)", "anchor": "xcm-cross-consensus-messaging", "start_char": 645, "end_char": 894, "estimated_token_count": 43, "token_estimator": "heuristic-v1", "text": "## XCM (Cross-Consensus Messaging)\n\nXCM provides a secure and trustless framework that facilitates communication between parachains, relay chains, and external blockchains, enabling asset transfers, data sharing, and complex cross-chain workflows."} {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 1, "depth": 3, "title": "For Parachain Integrators", "anchor": "for-parachain-integrators", "start_char": 894, "end_char": 1363, "estimated_token_count": 100, "token_estimator": "heuristic-v1", "text": "### For Parachain Integrators\n\nLearn to establish and use cross-chain communication channels:\n\n- **[Opening HRMP Channels Between Parachains](/tutorials/interoperability/xcm-channels/para-to-para/)**: Set up uni- and bidirectional messaging channels between parachains.\n- **[Opening HRMP Channels with System Parachains](/tutorials/interoperability/xcm-channels/para-to-system/)**: Establish communication channels with system parachains using optimized XCM messages."} {"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1363, "end_char": 1413, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1413, "end_char": 2197, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-interoperability", "page_title": "Interoperability Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1413, "end_char": 2175, "estimated_token_count": 188, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 36, "end_char": 1714, "estimated_token_count": 314, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nPolkadot's [OpenGov](/polkadot-protocol/onchain-governance/overview){target=\\_blank} is a sophisticated governance mechanism 
designed to allow the network to evolve gracefully over time, guided by its stakeholders. This system features multiple [tracks](https://wiki.polkadot.com/learn/learn-polkadot-opengov-origins/#origins-and-tracks-info){target=\\_blank} for different types of proposals, each with parameters for approval, support, and confirmation period. While this flexibility is powerful, it also introduces complexity that can lead to failed proposals or unexpected outcomes.\n\nTesting governance proposals before submission is crucial for the ecosystem. This process enhances efficiency by reducing the need for repeated submissions, improves security by identifying potential risks, and allows proposal optimization based on simulated outcomes. It also serves as an educational tool, providing stakeholders with a safe environment to understand the impacts of different voting scenarios. \n\nBy leveraging simulation tools like [Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks){target=\\_blank}, developers can:\n\n- Simulate the entire lifecycle of a proposal.\n- Test the voting outcomes by varying the support and approval levels.\n- Analyze the effects of a successfully executed proposal on the network's state.\n- Identify and troubleshoot potential issues or unexpected consequences before submitting the proposals.\n\nThis tutorial will guide you through using Chopsticks to test OpenGov proposals thoroughly. This ensures that when you submit a proposal to the live network, you can do so with confidence in its effects and viability."} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1714, "end_char": 2238, "estimated_token_count": 130, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore proceeding, ensure the following prerequisites are met:\n\n- **Chopsticks installation**: If you have not installed Chopsticks yet, refer to the [Install Chopsticks](/develop/toolkit/parachains/fork-chains/chopsticks/get-started/#install-chopsticks){target=\\_blank} guide for detailed instructions.\n- **Familiarity with key concepts**:\n - [Polkadot.js](/develop/toolkit/api-libraries/polkadot-js-api){target=\\_blank}\n - [OpenGov](/polkadot-protocol/onchain-governance/overview){target=\\_blank}"} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2238, "end_char": 3770, "estimated_token_count": 327, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nBefore testing OpenGov proposals, you need to set up your development environment. \nYou'll set up a TypeScript project and install the required dependencies to simulate and evaluate proposals. You'll use Chopsticks to fork the Polkadot network and simulate the proposal lifecycle, while Polkadot.js will be your interface for interacting with the forked network and submitting proposals.\n\nFollow these steps to set up your project:\n\n1. Create a new project directory and navigate into it:\n ```bash\n mkdir opengov-chopsticks && cd opengov-chopsticks\n ```\n\n2. Initialize a new TypeScript project:\n ```bash\n npm init -y \\\n && npm install typescript ts-node @types/node --save-dev \\\n && npx tsc --init\n ```\n\n3. 
Install the required dependencies:\n ```bash\n npm install @polkadot/api @acala-network/chopsticks\n ```\n\n4. Create a new TypeScript file for your script:\n ```bash\n touch test-proposal.ts\n ```\n\n !!!note\n You'll write your code to simulate and test OpenGov proposals in the `test-proposal.ts` file.\n\n5. Open the `tsconfig.json` file and ensure it includes these compiler options:\n ```json\n {\n \"compilerOptions\": {\n \"module\": \"CommonJS\",\n \"esModuleInterop\": true,\n \"target\": \"esnext\",\n \"moduleResolution\": \"node\",\n \"declaration\": true,\n \"sourceMap\": true,\n \"skipLibCheck\": true,\n \"outDir\": \"dist\",\n \"composite\": true\n }\n }\n\n ```"} @@ -1313,13 +1313,13 @@ {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 10, "depth": 2, "title": "Summary", "anchor": "summary", "start_char": 92075, "end_char": 92735, "estimated_token_count": 125, "token_estimator": "heuristic-v1", "text": "## Summary\n\nIn this tutorial, you've learned how to use Chopsticks to test OpenGov proposals on a local fork of the Polkadot network. You've set up a TypeScript project, connected to a local fork, submitted a proposal, and forced its execution for testing purposes. This process allows you to:\n\n- Safely experiment with different types of proposals.\n- Test the effects of proposals without affecting the live network.\n- Rapidly iterate and debug your governance ideas.\n\nUsing these techniques, you can develop and refine your proposals before submitting them to the Polkadot network, ensuring they're well-tested and likely to achieve their intended effects."} {"page_id": "tutorials-onchain-governance-fast-track-gov-proposal", "page_title": "Fast Track a Governance Proposal", "index": 11, "depth": 2, "title": "Full Code", "anchor": "full-code", "start_char": 92735, "end_char": 169907, "estimated_token_count": 15583, "token_estimator": "heuristic-v1", "text": "## Full Code\n\nHere's the complete code for the `test-proposal.ts` file, incorporating all the steps we've covered:\n\n??? 
code \"`test-proposal.ts`\"\n ```typescript\n // --8<-- [start:imports]\n import '@polkadot/api-augment/polkadot';\n import { FrameSupportPreimagesBounded } from '@polkadot/types/lookup';\n import { blake2AsHex } from '@polkadot/util-crypto';\n import { ApiPromise, Keyring, WsProvider } from '@polkadot/api';\n import { type SubmittableExtrinsic } from '@polkadot/api/types';\n import { ISubmittableResult } from '@polkadot/types/types';\n // --8<-- [end:imports]\n\n // --8<-- [start:connectToFork]\n /**\n * Establishes a connection to the local forked chain.\n *\n * @returns A promise that resolves to an `ApiPromise` instance connected to the local chain.\n */\n async function connectToFork(): Promise {\n const wsProvider = new WsProvider('ws://localhost:8000');\n const api = await ApiPromise.create({ provider: wsProvider });\n await api.isReady;\n console.log(`Connected to chain: ${await api.rpc.system.chain()}`);\n return api;\n }\n // --8<-- [end:connectToFork]\n\n // --8<-- [start:generateProposal]\n /**\n * Generates a proposal by submitting a preimage, creating the proposal, and placing a deposit.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param call - The extrinsic to be executed, encapsulating the specific action to be proposed.\n * @param origin - The origin of the proposal, specifying the source authority (e.g., `{ System: 'Root' }`).\n * @returns A promise that resolves to the proposal ID of the generated proposal.\n *\n */\n async function generateProposal(\n api: ApiPromise,\n call: SubmittableExtrinsic<'promise', ISubmittableResult>,\n origin: any\n ): Promise {\n // Initialize the keyring\n const keyring = new Keyring({ type: 'sr25519' });\n\n // Set up Alice development account\n const alice = keyring.addFromUri('//Alice');\n\n // Get the next available proposal index\n const proposalIndex = (\n await api.query.referenda.referendumCount()\n ).toNumber();\n\n // Execute the batch transaction\n await new Promise(async (resolve) => {\n const unsub = await api.tx.utility\n .batch([\n // Register the preimage for your proposal\n api.tx.preimage.notePreimage(call.method.toHex()),\n // Submit your proposal to the referenda system\n api.tx.referenda.submit(\n origin as any,\n {\n Lookup: {\n Hash: call.method.hash.toHex(),\n len: call.method.encodedLength,\n },\n },\n { At: 0 }\n ),\n // Place the required decision deposit\n api.tx.referenda.placeDecisionDeposit(proposalIndex),\n ])\n .signAndSend(alice, (status: any) => {\n if (status.blockNumber) {\n unsub();\n resolve();\n }\n });\n });\n return proposalIndex;\n }\n // --8<-- [end:generateProposal]\n\n // --8<-- [start:moveScheduledCallTo]\n /**\n * Moves a scheduled call to a specified future block if it matches the given verifier criteria.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param blockCounts - The number of blocks to move the scheduled call forward.\n * @param verifier - A function to verify if a scheduled call matches the desired criteria.\n * @throws An error if no matching scheduled call is found.\n */\n async function moveScheduledCallTo(\n api: ApiPromise,\n blockCounts: number,\n verifier: (call: FrameSupportPreimagesBounded) => boolean\n ) {\n // Get the current block number\n const blockNumber = (await api.rpc.chain.getHeader()).number.toNumber();\n \n // Retrieve the scheduler's agenda entries\n const agenda = await api.query.scheduler.agenda.entries();\n \n // Initialize a flag to 
track if a matching scheduled call is found\n let found = false;\n \n // Iterate through the scheduler's agenda entries\n for (const agendaEntry of agenda) {\n // Iterate through the scheduled entries in the current agenda entry\n for (const scheduledEntry of agendaEntry[1]) {\n // Check if the scheduled entry is valid and matches the verifier criteria\n if (scheduledEntry.isSome && verifier(scheduledEntry.unwrap().call)) {\n found = true;\n \n // Overwrite the agendaEntry item in storage\n const result = await api.rpc('dev_setStorage', [\n [agendaEntry[0]], // require to ensure unique id\n [\n await api.query.scheduler.agenda.key(blockNumber + blockCounts),\n agendaEntry[1].toHex(),\n ],\n ]);\n \n // Check if the scheduled call has an associated lookup\n if (scheduledEntry.unwrap().maybeId.isSome) {\n // Get the lookup ID\n const id = scheduledEntry.unwrap().maybeId.unwrap().toHex();\n const lookup = await api.query.scheduler.lookup(id);\n\n // Check if the lookup exists\n if (lookup.isSome) {\n // Get the lookup key\n const lookupKey = await api.query.scheduler.lookup.key(id);\n \n // Create a new lookup object with the updated block number\n const fastLookup = api.registry.createType('Option<(u32,u32)>', [\n blockNumber + blockCounts,\n 0,\n ]);\n \n // Overwrite the lookup entry in storage\n const result = await api.rpc('dev_setStorage', [\n [lookupKey, fastLookup.toHex()],\n ]);\n }\n }\n }\n }\n }\n \n // Throw an error if no matching scheduled call is found\n if (!found) {\n throw new Error('No scheduled call found');\n }\n }\n // --8<-- [end:moveScheduledCallTo]\n\n // --8<-- [start:forceProposalExecution]\n /**\n * Forces the execution of a specific proposal by updating its referendum state and ensuring the execution process is triggered.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param proposalIndex - The index of the proposal to be executed.\n * @throws An error if the referendum is not found or not in an ongoing state.\n */\n async function forceProposalExecution(api: ApiPromise, proposalIndex: number) {\n // Retrieve the referendum data for the given proposal index\n const referendumData = await api.query.referenda.referendumInfoFor(\n proposalIndex\n );\n // Get the storage key for the referendum data\n const referendumKey =\n api.query.referenda.referendumInfoFor.key(proposalIndex);\n\n // Check if the referendum data exists\n if (!referendumData.isSome) {\n throw new Error(`Referendum ${proposalIndex} not found`);\n }\n\n const referendumInfo = referendumData.unwrap();\n\n // Check if the referendum is in an ongoing state\n if (!referendumInfo.isOngoing) {\n throw new Error(`Referendum ${proposalIndex} is not ongoing`);\n }\n\n // Get the ongoing referendum data\n const ongoingData = referendumInfo.asOngoing;\n // Convert the ongoing data to JSON\n const ongoingJson = ongoingData.toJSON();\n\n // Support Lookup, Inline or Legacy proposals\n const callHash = ongoingData.proposal.isLookup\n ? ongoingData.proposal.asLookup.toHex()\n : ongoingData.proposal.isInline\n ? 
blake2AsHex(ongoingData.proposal.asInline.toHex())\n : ongoingData.proposal.asLegacy.toHex();\n\n // Get the total issuance of the native token\n const totalIssuance = (await api.query.balances.totalIssuance()).toBigInt();\n\n // Get the current block number\n const proposalBlockTarget = (\n await api.rpc.chain.getHeader()\n ).number.toNumber();\n\n // Create a new proposal data object with the updated fields\n const fastProposalData = {\n ongoing: {\n ...ongoingJson,\n enactment: { after: 0 },\n deciding: {\n since: proposalBlockTarget - 1,\n confirming: proposalBlockTarget - 1,\n },\n tally: {\n ayes: totalIssuance - 1n,\n nays: 0,\n support: totalIssuance - 1n,\n },\n alarm: [proposalBlockTarget + 1, [proposalBlockTarget + 1, 0]],\n },\n };\n\n // Create a new proposal object from the proposal data\n let fastProposal;\n try {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n } catch {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n }\n\n // Update the storage with the new proposal object\n const result = await api.rpc('dev_setStorage', [\n [referendumKey, fastProposal.toHex()],\n ]);\n\n // Fast forward the nudge referendum to the next block to get the refendum to be scheduled\n await moveScheduledCallTo(api, 1, (call) => {\n if (!call.isInline) {\n return false;\n }\n\n const callData = api.createType('Call', call.asInline.toHex());\n\n return (\n callData.method == 'nudgeReferendum' &&\n (callData.args[0] as any).toNumber() == proposalIndex\n );\n });\n\n // Create a new block\n await api.rpc('dev_newBlock', { count: 1 });\n\n // Move the scheduled call to the next block\n await moveScheduledCallTo(api, 1, (call) =>\n call.isLookup\n ? call.asLookup.toHex() == callHash\n : call.isInline\n ? 
blake2AsHex(call.asInline.toHex()) == callHash\n : call.asLegacy.toHex() == callHash\n );\n\n // Create another new block\n await api.rpc('dev_newBlock', { count: 1 });\n }\n // --8<-- [end:forceProposalExecution]\n\n // --8<-- [start:main]\n const main = async () => {\n // Connect to the forked chain\n const api = await connectToFork();\n\n // Select the call to perform\n const call = api.tx.system.setCodeWithoutChecks('0x1234');\n\n // Select the origin\n const origin = {\n System: 'Root',\n };\n\n // Submit preimage, submit proposal, and place decision deposit\n const proposalIndex = await generateProposal(api, call, origin);\n\n // Force the proposal to be executed\n await forceProposalExecution(api, proposalIndex);\n\n process.exit(0);\n };\n // --8<-- [end:main]\n\n // --8<-- [start:try-catch-block]\n try {\n main();\n } catch (e) {\n console.log(e);\n process.exit(1);\n }\n // --8<-- [end:try-catch-block]\n\n // --8<-- [start:imports]\n import '@polkadot/api-augment/polkadot';\n import { FrameSupportPreimagesBounded } from '@polkadot/types/lookup';\n import { blake2AsHex } from '@polkadot/util-crypto';\n import { ApiPromise, Keyring, WsProvider } from '@polkadot/api';\n import { type SubmittableExtrinsic } from '@polkadot/api/types';\n import { ISubmittableResult } from '@polkadot/types/types';\n // --8<-- [end:imports]\n\n // --8<-- [start:connectToFork]\n /**\n * Establishes a connection to the local forked chain.\n *\n * @returns A promise that resolves to an `ApiPromise` instance connected to the local chain.\n */\n async function connectToFork(): Promise {\n const wsProvider = new WsProvider('ws://localhost:8000');\n const api = await ApiPromise.create({ provider: wsProvider });\n await api.isReady;\n console.log(`Connected to chain: ${await api.rpc.system.chain()}`);\n return api;\n }\n // --8<-- [end:connectToFork]\n\n // --8<-- [start:generateProposal]\n /**\n * Generates a proposal by submitting a preimage, creating the proposal, and placing a deposit.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param call - The extrinsic to be executed, encapsulating the specific action to be proposed.\n * @param origin - The origin of the proposal, specifying the source authority (e.g., `{ System: 'Root' }`).\n * @returns A promise that resolves to the proposal ID of the generated proposal.\n *\n */\n async function generateProposal(\n api: ApiPromise,\n call: SubmittableExtrinsic<'promise', ISubmittableResult>,\n origin: any\n ): Promise {\n // Initialize the keyring\n const keyring = new Keyring({ type: 'sr25519' });\n\n // Set up Alice development account\n const alice = keyring.addFromUri('//Alice');\n\n // Get the next available proposal index\n const proposalIndex = (\n await api.query.referenda.referendumCount()\n ).toNumber();\n\n // Execute the batch transaction\n await new Promise(async (resolve) => {\n const unsub = await api.tx.utility\n .batch([\n // Register the preimage for your proposal\n api.tx.preimage.notePreimage(call.method.toHex()),\n // Submit your proposal to the referenda system\n api.tx.referenda.submit(\n origin as any,\n {\n Lookup: {\n Hash: call.method.hash.toHex(),\n len: call.method.encodedLength,\n },\n },\n { At: 0 }\n ),\n // Place the required decision deposit\n api.tx.referenda.placeDecisionDeposit(proposalIndex),\n ])\n .signAndSend(alice, (status: any) => {\n if (status.blockNumber) {\n unsub();\n resolve();\n }\n });\n });\n return proposalIndex;\n }\n // --8<-- 
[end:generateProposal]\n\n // --8<-- [start:moveScheduledCallTo]\n /**\n * Moves a scheduled call to a specified future block if it matches the given verifier criteria.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param blockCounts - The number of blocks to move the scheduled call forward.\n * @param verifier - A function to verify if a scheduled call matches the desired criteria.\n * @throws An error if no matching scheduled call is found.\n */\n async function moveScheduledCallTo(\n api: ApiPromise,\n blockCounts: number,\n verifier: (call: FrameSupportPreimagesBounded) => boolean\n ) {\n // Get the current block number\n const blockNumber = (await api.rpc.chain.getHeader()).number.toNumber();\n \n // Retrieve the scheduler's agenda entries\n const agenda = await api.query.scheduler.agenda.entries();\n \n // Initialize a flag to track if a matching scheduled call is found\n let found = false;\n \n // Iterate through the scheduler's agenda entries\n for (const agendaEntry of agenda) {\n // Iterate through the scheduled entries in the current agenda entry\n for (const scheduledEntry of agendaEntry[1]) {\n // Check if the scheduled entry is valid and matches the verifier criteria\n if (scheduledEntry.isSome && verifier(scheduledEntry.unwrap().call)) {\n found = true;\n \n // Overwrite the agendaEntry item in storage\n const result = await api.rpc('dev_setStorage', [\n [agendaEntry[0]], // require to ensure unique id\n [\n await api.query.scheduler.agenda.key(blockNumber + blockCounts),\n agendaEntry[1].toHex(),\n ],\n ]);\n \n // Check if the scheduled call has an associated lookup\n if (scheduledEntry.unwrap().maybeId.isSome) {\n // Get the lookup ID\n const id = scheduledEntry.unwrap().maybeId.unwrap().toHex();\n const lookup = await api.query.scheduler.lookup(id);\n\n // Check if the lookup exists\n if (lookup.isSome) {\n // Get the lookup key\n const lookupKey = await api.query.scheduler.lookup.key(id);\n \n // Create a new lookup object with the updated block number\n const fastLookup = api.registry.createType('Option<(u32,u32)>', [\n blockNumber + blockCounts,\n 0,\n ]);\n \n // Overwrite the lookup entry in storage\n const result = await api.rpc('dev_setStorage', [\n [lookupKey, fastLookup.toHex()],\n ]);\n }\n }\n }\n }\n }\n \n // Throw an error if no matching scheduled call is found\n if (!found) {\n throw new Error('No scheduled call found');\n }\n }\n // --8<-- [end:moveScheduledCallTo]\n\n // --8<-- [start:forceProposalExecution]\n /**\n * Forces the execution of a specific proposal by updating its referendum state and ensuring the execution process is triggered.\n *\n * @param api - An instance of the Polkadot.js API promise used to interact with the blockchain.\n * @param proposalIndex - The index of the proposal to be executed.\n * @throws An error if the referendum is not found or not in an ongoing state.\n */\n async function forceProposalExecution(api: ApiPromise, proposalIndex: number) {\n // Retrieve the referendum data for the given proposal index\n const referendumData = await api.query.referenda.referendumInfoFor(\n proposalIndex\n );\n // Get the storage key for the referendum data\n const referendumKey =\n api.query.referenda.referendumInfoFor.key(proposalIndex);\n\n // Check if the referendum data exists\n if (!referendumData.isSome) {\n throw new Error(`Referendum ${proposalIndex} not found`);\n }\n\n const referendumInfo = referendumData.unwrap();\n\n // Check if the referendum is in an ongoing 
state\n if (!referendumInfo.isOngoing) {\n throw new Error(`Referendum ${proposalIndex} is not ongoing`);\n }\n\n // Get the ongoing referendum data\n const ongoingData = referendumInfo.asOngoing;\n // Convert the ongoing data to JSON\n const ongoingJson = ongoingData.toJSON();\n\n // Support Lookup, Inline or Legacy proposals\n const callHash = ongoingData.proposal.isLookup\n ? ongoingData.proposal.asLookup.toHex()\n : ongoingData.proposal.isInline\n ? blake2AsHex(ongoingData.proposal.asInline.toHex())\n : ongoingData.proposal.asLegacy.toHex();\n\n // Get the total issuance of the native token\n const totalIssuance = (await api.query.balances.totalIssuance()).toBigInt();\n\n // Get the current block number\n const proposalBlockTarget = (\n await api.rpc.chain.getHeader()\n ).number.toNumber();\n\n // Create a new proposal data object with the updated fields\n const fastProposalData = {\n ongoing: {\n ...ongoingJson,\n enactment: { after: 0 },\n deciding: {\n since: proposalBlockTarget - 1,\n confirming: proposalBlockTarget - 1,\n },\n tally: {\n ayes: totalIssuance - 1n,\n nays: 0,\n support: totalIssuance - 1n,\n },\n alarm: [proposalBlockTarget + 1, [proposalBlockTarget + 1, 0]],\n },\n };\n\n // Create a new proposal object from the proposal data\n let fastProposal;\n try {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n } catch {\n fastProposal = api.registry.createType(\n `Option`,\n fastProposalData\n );\n }\n\n // Update the storage with the new proposal object\n const result = await api.rpc('dev_setStorage', [\n [referendumKey, fastProposal.toHex()],\n ]);\n\n // Fast forward the nudge referendum to the next block to get the refendum to be scheduled\n await moveScheduledCallTo(api, 1, (call) => {\n if (!call.isInline) {\n return false;\n }\n\n const callData = api.createType('Call', call.asInline.toHex());\n\n return (\n callData.method == 'nudgeReferendum' &&\n (callData.args[0] as any).toNumber() == proposalIndex\n );\n });\n\n // Create a new block\n await api.rpc('dev_newBlock', { count: 1 });\n\n // Move the scheduled call to the next block\n await moveScheduledCallTo(api, 1, (call) =>\n call.isLookup\n ? call.asLookup.toHex() == callHash\n : call.isInline\n ? 
blake2AsHex(call.asInline.toHex()) == callHash\n : call.asLegacy.toHex() == callHash\n );\n\n // Create another new block\n await api.rpc('dev_newBlock', { count: 1 });\n }\n // --8<-- [end:forceProposalExecution]\n\n // --8<-- [start:main]\n const main = async () => {\n // Connect to the forked chain\n const api = await connectToFork();\n\n // Select the call to perform\n const call = api.tx.system.setCodeWithoutChecks('0x1234');\n\n // Select the origin\n const origin = {\n System: 'Root',\n };\n\n // Submit preimage, submit proposal, and place decision deposit\n const proposalIndex = await generateProposal(api, call, origin);\n\n // Force the proposal to be executed\n await forceProposalExecution(api, proposalIndex);\n\n process.exit(0);\n };\n // --8<-- [end:main]\n\n // --8<-- [start:try-catch-block]\n try {\n main();\n } catch (e) {\n console.log(e);\n process.exit(1);\n }\n // --8<-- [end:try-catch-block]\n\n ```"} {"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 405, "end_char": 455, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 455, "end_char": 883, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 455, "end_char": 872, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 30, "end_char": 866, "estimated_token_count": 192, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn previous tutorials, you learned how to [create a custom pallet](/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/){target=\\_blank} and [test it](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-unit-testing/){target=\\_blank}. The next step is to include this pallet in your runtime, integrating it into the core logic of your blockchain.\n\nThis tutorial will guide you through adding two pallets to your runtime: the custom pallet you previously developed and the [utility pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/index.html){target=\\_blank}. This standard Polkadot SDK pallet provides powerful dispatch functionality. 
The utility pallet offers, for example, batch dispatch, a stateless operation that enables executing multiple calls in a single transaction."} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 1, "depth": 2, "title": "Add the Pallets as Dependencies", "anchor": "add-the-pallets-as-dependencies", "start_char": 866, "end_char": 8510, "estimated_token_count": 1856, "token_estimator": "heuristic-v1", "text": "## Add the Pallets as Dependencies\n\nFirst, you'll update the runtime's `Cargo.toml` file to include the Utility pallet and your custom pallets as dependencies for the runtime. Follow these steps:\n\n1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:\n\n ```toml hl_lines=\"4\" title=\"runtime/Cargo.toml\"\n [dependencies]\n ...\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n ...\n ], default-features = false }\n ```\n\n2. In the same `[dependencies]` section, add the custom pallet that you built from scratch with the following line:\n\n ```toml hl_lines=\"3\" title=\"Cargo.toml\"\n [dependencies]\n ...\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n ```\n\n3. In the `[features]` section, add the custom pallet to the `std` feature list:\n\n ```toml hl_lines=\"5\" title=\"Cargo.toml\"\n [features]\n default = [\"std\"]\n std = [\n ...\n \"custom-pallet/std\",\n ...\n ]\n ```\n\n3. Save the changes and close the `Cargo.toml` file.\n\n Once you have saved your file, it should look like the following:\n\n ???- code \"runtime/Cargo.toml\"\n \n ```rust title=\"runtime/Cargo.toml\"\n [package]\n name = \"parachain-template-runtime\"\n description = \"A parachain runtime template built with Substrate and Cumulus, part of Polkadot Sdk.\"\n version = \"0.1.0\"\n license = \"Unlicense\"\n authors.workspace = true\n homepage.workspace = true\n repository.workspace = true\n edition.workspace = true\n publish = false\n\n [package.metadata.docs.rs]\n targets = [\"x86_64-unknown-linux-gnu\"]\n\n [build-dependencies]\n docify = { workspace = true }\n substrate-wasm-builder = { optional = true, workspace = true, default-features = true }\n\n [dependencies]\n codec = { features = [\"derive\"], workspace = true }\n cumulus-pallet-parachain-system.workspace = true\n docify = { workspace = true }\n hex-literal = { optional = true, workspace = true, default-features = true }\n log = { workspace = true }\n pallet-parachain-template = { path = \"../pallets/template\", default-features = false }\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n \"cumulus-pallet-aura-ext\",\n \"cumulus-pallet-session-benchmarking\",\n \"cumulus-pallet-weight-reclaim\",\n \"cumulus-pallet-xcm\",\n \"cumulus-pallet-xcmp-queue\",\n \"cumulus-primitives-aura\",\n \"cumulus-primitives-core\",\n \"cumulus-primitives-utility\",\n \"pallet-aura\",\n \"pallet-authorship\",\n \"pallet-balances\",\n \"pallet-collator-selection\",\n \"pallet-message-queue\",\n \"pallet-session\",\n \"pallet-sudo\",\n \"pallet-timestamp\",\n \"pallet-transaction-payment\",\n \"pallet-transaction-payment-rpc-runtime-api\",\n \"pallet-xcm\",\n \"parachains-common\",\n \"polkadot-parachain-primitives\",\n \"polkadot-runtime-common\",\n \"runtime\",\n \"staging-parachain-info\",\n \"staging-xcm\",\n \"staging-xcm-builder\",\n \"staging-xcm-executor\",\n ], 
default-features = false }\n scale-info = { features = [\"derive\"], workspace = true }\n serde_json = { workspace = true, default-features = false, features = [\n \"alloc\",\n ] }\n smallvec = { workspace = true, default-features = true }\n\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n\n [features]\n default = [\"std\"]\n std = [\n \"codec/std\",\n \"cumulus-pallet-parachain-system/std\",\n \"log/std\",\n \"pallet-parachain-template/std\",\n \"polkadot-sdk/std\",\n \"scale-info/std\",\n \"serde_json/std\",\n \"substrate-wasm-builder\",\n \"custom-pallet/std\",\n ]\n\n runtime-benchmarks = [\n \"cumulus-pallet-parachain-system/runtime-benchmarks\",\n \"hex-literal\",\n \"pallet-parachain-template/runtime-benchmarks\",\n \"polkadot-sdk/runtime-benchmarks\",\n ]\n\n try-runtime = [\n \"cumulus-pallet-parachain-system/try-runtime\",\n \"pallet-parachain-template/try-runtime\",\n \"polkadot-sdk/try-runtime\",\n ]\n\n # Enable the metadata hash generation.\n #\n # This is hidden behind a feature because it increases the compile time.\n # The wasm binary needs to be compiled twice, once to fetch the metadata,\n # generate the metadata hash and then a second time with the\n # `RUNTIME_METADATA_HASH` environment variable set for the `CheckMetadataHash`\n # extension.\n metadata-hash = [\"substrate-wasm-builder/metadata-hash\"]\n\n # A convenience feature for enabling things when doing a build\n # for an on-chain release.\n on-chain-release-build = [\"metadata-hash\"]\n\n ```\n\nUpdate your root parachain template's `Cargo.toml` file to include your custom pallet as a dependency. Follow these steps:\n\n1. Open the `./Cargo.toml` file and locate the `[workspace]` section. \n \n Make sure the `custom-pallet` is a member of the workspace:\n\n ```toml hl_lines=\"4\" title=\"Cargo.toml\"\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n ```\n\n???- code \"./Cargo.toml\"\n\n ```rust title=\"./Cargo.toml\"\n [workspace.package]\n license = \"MIT-0\"\n authors = [\"Parity Technologies \"]\n homepage = \"https://paritytech.github.io/polkadot-sdk/\"\n repository = \"https://github.com/paritytech/polkadot-sdk-parachain-template.git\"\n edition = \"2021\"\n\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n resolver = \"2\"\n\n [workspace.dependencies]\n parachain-template-runtime = { path = \"./runtime\", default-features = false }\n pallet-parachain-template = { path = \"./pallets/template\", default-features = false }\n clap = { version = \"4.5.13\" }\n color-print = { version = \"0.3.4\" }\n docify = { version = \"0.2.9\" }\n futures = { version = \"0.3.31\" }\n jsonrpsee = { version = \"0.24.3\" }\n log = { version = \"0.4.22\", default-features = false }\n polkadot-sdk = { version = \"2503.0.1\", default-features = false }\n prometheus-endpoint = { version = \"0.17.2\", default-features = false, package = \"substrate-prometheus-endpoint\" }\n serde = { version = \"1.0.214\", default-features = false }\n codec = { version = \"3.7.4\", default-features = false, package = \"parity-scale-codec\" }\n cumulus-pallet-parachain-system = { version = \"0.20.0\", default-features = false }\n hex-literal = { version = \"0.4.1\", default-features = false }\n scale-info = { version = \"2.11.6\", default-features = false }\n serde_json = { version = 
\"1.0.132\", default-features = false }\n smallvec = { version = \"1.11.0\", default-features = false }\n substrate-wasm-builder = { version = \"26.0.1\", default-features = false }\n frame = { version = \"0.9.1\", default-features = false, package = \"polkadot-sdk-frame\" }\n\n [profile.release]\n opt-level = 3\n panic = \"unwind\"\n\n [profile.production]\n codegen-units = 1\n inherits = \"release\"\n lto = true\n ```"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 2, "depth": 3, "title": "Update the Runtime Configuration", "anchor": "update-the-runtime-configuration", "start_char": 8510, "end_char": 10415, "estimated_token_count": 406, "token_estimator": "heuristic-v1", "text": "### Update the Runtime Configuration\n\nConfigure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets:\n\n1. Add the `OriginCaller` import:\n\n ```rust title=\"mod.rs\" hl_lines=\"8\"\n // Local module imports\n use super::OriginCaller;\n ...\n ```\n\n2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file:\n\n ```rust title=\"mod.rs\" hl_lines=\"8-25\"\n ...\n /// Configure the pallet template in pallets/template.\n impl pallet_parachain_template::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n\n // Configure utility pallet.\n impl pallet_utility::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type RuntimeCall = RuntimeCall;\n type PalletsOrigin = OriginCaller;\n type WeightInfo = pallet_utility::weights::SubstrateWeight;\n }\n // Define counter max value runtime constant.\n parameter_types! {\n pub const CounterMaxValue: u32 = 500;\n }\n\n // Configure custom pallet.\n impl custom_pallet::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type CounterMaxValue = CounterMaxValue;\n }\n ```\n\n3. Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets:\n\n ```rust hl_lines=\"9-14\" title=\"lib.rs\"\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n ...\n )]\n pub struct Runtime;\n #[runtime::pallet_index(51)]\n pub type Utility = pallet_utility;\n\n #[runtime::pallet_index(52)]\n pub type CustomPallet = custom_pallet;\n }\n ```"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 3, "depth": 2, "title": "Recompile the Runtime", "anchor": "recompile-the-runtime", "start_char": 10415, "end_char": 10864, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Recompile the Runtime\n\nAfter adding and configuring your pallets in the runtime, the next step is to ensure everything is set up correctly. 
To do this, recompile the runtime with the following command (make sure you're in the project's root directory):\n\n```bash\ncargo build --release\n```\n\nThis command ensures the runtime compiles without errors, validates the pallet configurations, and prepares the build for subsequent testing or deployment."} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 4, "depth": 2, "title": "Run Your Chain Locally", "anchor": "run-your-chain-locally", "start_char": 10864, "end_char": 12337, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Run Your Chain Locally\n\nLaunch your parachain locally and start producing blocks:\n\n!!!tip\n Generated chain TestNet specifications include development accounts \"Alice\" and \"Bob.\" These accounts are pre-funded with native parachain currency, allowing you to sign and send TestNet transactions. Take a look at the [Polkadot.js Accounts section](https://polkadot.js.org/apps/#/accounts){target=\\_blank} to view the development accounts for your chain.\n\n1. Create a new chain specification file with the updated runtime:\n\n ```bash\n chain-spec-builder create -t development \\\n --relay-chain paseo \\\n --para-id 1000 \\\n --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \\\n named-preset development\n ```\n\n2. Start the omni node with the generated chain specification:\n\n ```bash\n polkadot-omni-node --chain ./chain_spec.json --dev\n ```\n\n3. Verify you can interact with the new pallets using the [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics){target=\\_blank} interface. Navigate to the **Extrinsics** tab and check that you can see both pallets:\n\n - Utility pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-1.webp)\n \n\n - Custom pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-2.webp)"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 12337, "end_char": 13089, "estimated_token_count": 183, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Deploy on Paseo TestNet__\n\n ---\n\n Deploy your Polkadot SDK blockchain on Paseo! Follow this step-by-step guide for a seamless journey to a successful TestNet deployment.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/deploy-to-testnet/)\n\n- Tutorial __Pallet Benchmarking (Optional)__\n\n ---\n\n Discover how to measure extrinsic costs and assign precise weights to optimize your pallet for accurate fees and runtime performance.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-benchmarking/)\n\n
"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 1, "depth": 2, "title": "Add the Pallets as Dependencies", "anchor": "add-the-pallets-as-dependencies", "start_char": 866, "end_char": 6394, "estimated_token_count": 1262, "token_estimator": "heuristic-v1", "text": "## Add the Pallets as Dependencies\n\nFirst, you'll update the runtime's `Cargo.toml` file to include the Utility pallet and your custom pallets as dependencies for the runtime. Follow these steps:\n\n1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:\n\n ```toml hl_lines=\"4\" title=\"runtime/Cargo.toml\"\n \n ...\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n ...\n ], default-features = false }\n ```\n\n2. In the same `[dependencies]` section, add the custom pallet that you built from scratch with the following line:\n\n ```toml hl_lines=\"3\" title=\"Cargo.toml\"\n [dependencies]\n ...\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n ```\n\n3. In the `[features]` section, add the custom pallet to the `std` feature list:\n\n ```toml hl_lines=\"5\" title=\"Cargo.toml\"\n [features]\n default = [\"std\"]\n std = [\n ...\n \"custom-pallet/std\",\n ...\n ]\n ```\n\n3. Save the changes and close the `Cargo.toml` file.\n\n Once you have saved your file, it should look like the following:\n\n ???- code \"runtime/Cargo.toml\"\n \n ```rust title=\"runtime/Cargo.toml\"\n [package]\n name = \"parachain-template-runtime\"\n description = \"A parachain runtime template built with Substrate and Cumulus, part of Polkadot Sdk.\"\n version = \"0.1.0\"\n license = \"Unlicense\"\n authors.workspace = true\n homepage.workspace = true\n repository.workspace = true\n edition.workspace = true\n publish = false\n\n [package.metadata.docs.rs]\n targets = [\"x86_64-unknown-linux-gnu\"]\n\n [build-dependencies]\n docify = { workspace = true }\n substrate-wasm-builder = { optional = true, workspace = true, default-features = true }\n\n [dependencies]\n codec = { features = [\"derive\"], workspace = true }\n cumulus-pallet-parachain-system.workspace = true\n docify = { workspace = true }\n hex-literal = { optional = true, workspace = true, default-features = true }\n log = { workspace = true }\n pallet-parachain-template = { path = \"../pallets/template\", default-features = false }\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n \"cumulus-pallet-aura-ext\",\n \"cumulus-pallet-session-benchmarking\",\n \"cumulus-pallet-weight-reclaim\",\n \"cumulus-pallet-xcm\",\n \"cumulus-pallet-xcmp-queue\",\n \"cumulus-primitives-aura\",\n \"cumulus-primitives-core\",\n \"cumulus-primitives-utility\",\n \"pallet-aura\",\n \"pallet-authorship\",\n \"pallet-balances\",\n \"pallet-collator-selection\",\n \"pallet-message-queue\",\n \"pallet-session\",\n \"pallet-sudo\",\n \"pallet-timestamp\",\n \"pallet-transaction-payment\",\n \"pallet-transaction-payment-rpc-runtime-api\",\n \"pallet-xcm\",\n \"parachains-common\",\n \"polkadot-parachain-primitives\",\n \"polkadot-runtime-common\",\n \"runtime\",\n \"staging-parachain-info\",\n \"staging-xcm\",\n \"staging-xcm-builder\",\n \"staging-xcm-executor\",\n ], default-features = false }\n scale-info = { features = [\"derive\"], workspace = true }\n serde_json = { workspace = true, default-features = false, features = 
[\n \"alloc\",\n ] }\n smallvec = { workspace = true, default-features = true }\n\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n\n [features]\n default = [\"std\"]\n std = [\n \"codec/std\",\n \"cumulus-pallet-parachain-system/std\",\n \"log/std\",\n \"pallet-parachain-template/std\",\n \"polkadot-sdk/std\",\n \"scale-info/std\",\n \"serde_json/std\",\n \"substrate-wasm-builder\",\n \"custom-pallet/std\",\n ]\n\n runtime-benchmarks = [\n \"cumulus-pallet-parachain-system/runtime-benchmarks\",\n \"hex-literal\",\n \"pallet-parachain-template/runtime-benchmarks\",\n \"polkadot-sdk/runtime-benchmarks\",\n ]\n\n try-runtime = [\n \"cumulus-pallet-parachain-system/try-runtime\",\n \"pallet-parachain-template/try-runtime\",\n \"polkadot-sdk/try-runtime\",\n ]\n\n # Enable the metadata hash generation.\n #\n # This is hidden behind a feature because it increases the compile time.\n # The wasm binary needs to be compiled twice, once to fetch the metadata,\n # generate the metadata hash and then a second time with the\n # `RUNTIME_METADATA_HASH` environment variable set for the `CheckMetadataHash`\n # extension.\n metadata-hash = [\"substrate-wasm-builder/metadata-hash\"]\n\n # A convenience feature for enabling things when doing a build\n # for an on-chain release.\n on-chain-release-build = [\"metadata-hash\"]\n\n ```\n\nUpdate your root parachain template's `Cargo.toml` file to include your custom pallet as a dependency. Follow these steps:\n\n1. Open the `./Cargo.toml` file and locate the `[workspace]` section. \n \n Make sure the `custom-pallet` is a member of the workspace:\n\n ```toml hl_lines=\"4\" title=\"Cargo.toml\"\n \n ```\n\n???- code \"./Cargo.toml\"\n\n ```rust title=\"./Cargo.toml\"\n \n ```"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 2, "depth": 3, "title": "Update the Runtime Configuration", "anchor": "update-the-runtime-configuration", "start_char": 6394, "end_char": 8299, "estimated_token_count": 406, "token_estimator": "heuristic-v1", "text": "### Update the Runtime Configuration\n\nConfigure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets:\n\n1. Add the `OriginCaller` import:\n\n ```rust title=\"mod.rs\" hl_lines=\"8\"\n // Local module imports\n use super::OriginCaller;\n ...\n ```\n\n2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file:\n\n ```rust title=\"mod.rs\" hl_lines=\"8-25\"\n ...\n /// Configure the pallet template in pallets/template.\n impl pallet_parachain_template::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n\n // Configure utility pallet.\n impl pallet_utility::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type RuntimeCall = RuntimeCall;\n type PalletsOrigin = OriginCaller;\n type WeightInfo = pallet_utility::weights::SubstrateWeight;\n }\n // Define counter max value runtime constant.\n parameter_types! {\n pub const CounterMaxValue: u32 = 500;\n }\n\n // Configure custom pallet.\n impl custom_pallet::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type CounterMaxValue = CounterMaxValue;\n }\n ```\n\n3. 
Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets:\n\n ```rust hl_lines=\"9-14\" title=\"lib.rs\"\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n ...\n )]\n pub struct Runtime;\n #[runtime::pallet_index(51)]\n pub type Utility = pallet_utility;\n\n #[runtime::pallet_index(52)]\n pub type CustomPallet = custom_pallet;\n }\n ```"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 3, "depth": 2, "title": "Recompile the Runtime", "anchor": "recompile-the-runtime", "start_char": 8299, "end_char": 8748, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Recompile the Runtime\n\nAfter adding and configuring your pallets in the runtime, the next step is to ensure everything is set up correctly. To do this, recompile the runtime with the following command (make sure you're in the project's root directory):\n\n```bash\ncargo build --release\n```\n\nThis command ensures the runtime compiles without errors, validates the pallet configurations, and prepares the build for subsequent testing or deployment."} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 4, "depth": 2, "title": "Run Your Chain Locally", "anchor": "run-your-chain-locally", "start_char": 8748, "end_char": 10221, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Run Your Chain Locally\n\nLaunch your parachain locally and start producing blocks:\n\n!!!tip\n Generated chain TestNet specifications include development accounts \"Alice\" and \"Bob.\" These accounts are pre-funded with native parachain currency, allowing you to sign and send TestNet transactions. Take a look at the [Polkadot.js Accounts section](https://polkadot.js.org/apps/#/accounts){target=\\_blank} to view the development accounts for your chain.\n\n1. Create a new chain specification file with the updated runtime:\n\n ```bash\n chain-spec-builder create -t development \\\n --relay-chain paseo \\\n --para-id 1000 \\\n --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \\\n named-preset development\n ```\n\n2. Start the omni node with the generated chain specification:\n\n ```bash\n polkadot-omni-node --chain ./chain_spec.json --dev\n ```\n\n3. Verify you can interact with the new pallets using the [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics){target=\\_blank} interface. Navigate to the **Extrinsics** tab and check that you can see both pallets:\n\n - Utility pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-1.webp)\n \n\n - Custom pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-2.webp)"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 10221, "end_char": 10973, "estimated_token_count": 183, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Deploy on Paseo TestNet__\n\n ---\n\n Deploy your Polkadot SDK blockchain on Paseo! Follow this step-by-step guide for a seamless journey to a successful TestNet deployment.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/deploy-to-testnet/)\n\n- Tutorial __Pallet Benchmarking (Optional)__\n\n ---\n\n Discover how to measure extrinsic costs and assign precise weights to optimize your pallet for accurate fees and runtime performance.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-benchmarking/)\n\n
"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 25, "end_char": 1088, "estimated_token_count": 214, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn Polkadot SDK-based blockchains, runtime functionality is built through modular components called [pallets](/polkadot-protocol/glossary#pallet){target=\\_blank}. These pallets are Rust-based runtime modules created using [FRAME (Framework for Runtime Aggregation of Modular Entities)](/develop/parachains/customize-parachain/overview/){target=\\_blank}, a powerful library that simplifies blockchain development by providing specialized macros and standardized patterns for building blockchain logic.\nA pallet encapsulates a specific set of blockchain functionalities, such as managing token balances, implementing governance mechanisms, or creating custom state transitions.\n\nIn this tutorial, you'll learn how to create a custom pallet from scratch. You will develop a simple counter pallet with the following features:\n\n- Users can increment and decrement a counter.\n- Only a [root origin](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Origin.html#variant.Root){target=\\_blank} can set an arbitrary counter value."} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1088, "end_char": 1378, "estimated_token_count": 85, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nYou'll use the [Polkadot SDK Parachain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\\_blank} created in the [Set Up a Template](/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/){target=\\_blank} tutorial."} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 2, "depth": 2, "title": "Create a New Project", "anchor": "create-a-new-project", "start_char": 1378, "end_char": 2276, "estimated_token_count": 198, "token_estimator": "heuristic-v1", "text": "## Create a New Project\n\nIn this tutorial, you'll build a custom pallet from scratch to demonstrate the complete workflow, rather than starting with the pre-built `pallet-template`. The first step is to create a new Rust package for your pallet:\n\n1. Navigate to the `pallets` directory in your workspace:\n\n ```bash\n cd pallets\n ```\n\n2. Create a new Rust library project for your custom pallet by running the following command:\n\n ```bash\n cargo new --lib custom-pallet\n ```\n\n3. Enter the new project directory:\n\n ```bash\n cd custom-pallet\n ```\n\n4. Ensure the project was created successfully by checking its structure. 
The file layout should resemble the following:\n\n ```\n custom-pallet \n ├── Cargo.toml\n └── src\n └── lib.rs\n ```\n\n If the files are in place, your project setup is complete, and you're ready to start building your custom pallet."} @@ -1433,7 +1433,7 @@ {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 0, "depth": 2, "title": "Benefits of Asset Hub", "anchor": "benefits-of-asset-hub", "start_char": 23, "end_char": 1017, "estimated_token_count": 224, "token_estimator": "heuristic-v1", "text": "## Benefits of Asset Hub\n\nPolkadot SDK-based relay chains focus on security and consensus, leaving asset management to an external component, such as a [system chain](/polkadot-protocol/architecture/system-chains/){target=\\_blank}. The [Asset Hub](/polkadot-protocol/architecture/system-chains/asset-hub/){target=\\_blank} is one example of a system chain and is vital to managing tokens which aren't native to the Polkadot ecosystem. Developers opting to integrate with Asset Hub can expect the following benefits:\n\n- **Support for non-native on-chain assets**: Create and manage your own tokens or NFTs with Polkadot ecosystem compatibility available out of the box.\n- **Lower transaction fees**: Approximately 1/10th of the cost of using the relay chain.\n- **Reduced deposit requirements**: Approximately 1/100th of the deposit required for the relay chain.\n- **Payment of fees with non-native assets**: No need to buy native tokens for gas, increasing flexibility for developers and users."} {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 1, "depth": 2, "title": "Get Started", "anchor": "get-started", "start_char": 1017, "end_char": 1303, "estimated_token_count": 48, "token_estimator": "heuristic-v1", "text": "## Get Started\n\nThrough these tutorials, you'll learn how to manage cross-chain assets, including:\n\n- Asset registration and configuration\n- Cross-chain asset representation\n- Liquidity pool creation and management \n- Asset swapping and conversion\n- Transaction parameter optimization"} {"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1303, "end_char": 1353, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1353, "end_char": 1778, "estimated_token_count": 116, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-polkadot-sdk-system-chains-asset-hub", "page_title": "Asset Hub Tutorials", "index": 3, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1353, "end_char": 1767, "estimated_token_count": 113, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 0, "depth": 2, "title": "For Parachain Integrators", "anchor": "for-parachain-integrators", "start_char": 619, "end_char": 990, "estimated_token_count": 83, "token_estimator": "heuristic-v1", "text": "## For Parachain Integrators\n\nEnhance cross-chain interoperability and expand your parachain’s 
functionality:\n\n- **[Register your parachain's asset on Asset Hub](/tutorials/polkadot-sdk/system-chains/asset-hub/register-foreign-asset/)**: Connect your parachain’s assets to Asset Hub as a foreign asset using XCM, enabling seamless cross-chain transfers and integration."} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 1, "depth": 2, "title": "For Developers Leveraging System Chains", "anchor": "for-developers-leveraging-system-chains", "start_char": 990, "end_char": 1551, "estimated_token_count": 134, "token_estimator": "heuristic-v1", "text": "## For Developers Leveraging System Chains\n\nUnlock new possibilities by tapping into Polkadot’s system chains:\n\n- **[Register a new asset on Asset Hub](/tutorials/polkadot-sdk/system-chains/asset-hub/register-local-asset/)**: Create and customize assets directly on Asset Hub (local assets) with parameters like metadata, minimum balances, and more.\n\n- **[Convert Assets](/tutorials/polkadot-sdk/system-chains/asset-hub/asset-conversion/)**: Use Asset Hub's AMM functionality to swap between different assets, provide liquidity to pools, and manage LP tokens."} {"page_id": "tutorials-polkadot-sdk-system-chains", "page_title": "System Chains Tutorials", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1551, "end_char": 1600, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} @@ -1459,7 +1459,7 @@ {"page_id": "tutorials-polkadot-sdk-testing", "page_title": "Blockchain Testing Tutorials", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 794, "end_char": 843, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 0, "depth": 2, "title": "Build and Deploy a Parachain", "anchor": "build-and-deploy-a-parachain", "start_char": 450, "end_char": 1038, "estimated_token_count": 133, "token_estimator": "heuristic-v1", "text": "## Build and Deploy a Parachain\n\nFollow these key milestones to guide you through parachain development. Each step links to detailed tutorials for a deeper dive into each stage:\n\n- **[Install the Polkadot SDK](/develop/parachains/install-polkadot-sdk/)**: Set up the necessary tools to begin building on Polkadot. 
This step will get your environment ready for parachain development.\n\n- **[Parachains Zero to Hero](/tutorials/polkadot-sdk/parachains/zero-to-hero/)**: A series of step-by-step guides to building, testing, and deploying custom pallets and runtimes using the Polkadot SDK."} {"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1038, "end_char": 1088, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} -{"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1088, "end_char": 1489, "estimated_token_count": 113, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} +{"page_id": "tutorials-polkadot-sdk", "page_title": "Polkadot SDK Tutorials", "index": 2, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 1088, "end_char": 1478, "estimated_token_count": 110, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 191, "end_char": 857, "estimated_token_count": 131, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nDecentralized exchanges (DEXs) are a cornerstone of the DeFi ecosystem, allowing for permissionless token swaps without intermediaries. [Uniswap V2](https://docs.uniswap.org/contracts/v2/overview){target=\\_blank}, with its Automated Market Maker (AMM) model, revolutionized DEXs by enabling liquidity provision for any ERC-20 token pair.\n\nThis tutorial will guide you through how Uniswap V2 works so you can take advantage of it in your projects deployed to Polkadot Hub. By understanding these contracts, you'll gain hands-on experience with one of the most influential DeFi protocols and understand how it functions across blockchain ecosystems."} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 857, "end_char": 1352, "estimated_token_count": 121, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- Node.js (v16.0.0 or later) and npm installed.\n- Basic understanding of Solidity and JavaScript.\n- Familiarity with [`hardhat-polkadot`](/develop/smart-contracts/dev-environments/hardhat){target=\\_blank} development environment.\n- Some PAS test tokens to cover transaction fees (obtained from the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\\_blank}).\n- Basic understanding of how AMMs and liquidity pools work."} {"page_id": "tutorials-smart-contracts-demo-aplications-deploying-uniswap-v2", "page_title": "Deploying Uniswap V2 on Polkadot", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 1352, "end_char": 3690, "estimated_token_count": 572, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nLet's start by cloning the Uniswap V2 project:\n\n1. 
Clone the Uniswap V2 repository:\n\n ```\n git clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.6\n cd polkavm-hardhat-examples/uniswap-v2-polkadot/\n ```\n\n2. Install the required dependencies:\n\n ```bash\n npm install\n ```\n\n3. Update the `hardhat.config.js` file so the paths for the Substrate node and the ETH-RPC adapter match with the paths on your machine. For more info, check the [Testing your Contract](/develop/smart-contracts/dev-environments/hardhat/#testing-your-contract){target=\\_blank} section in the Hardhat guide.\n\n ```js title=\"hardhat.config.js\"\n hardhat: {\n polkavm: true,\n nodeConfig: {\n nodeBinaryPath: '../bin/substrate-node',\n rpcPort: 8000,\n dev: true,\n },\n adapterConfig: {\n adapterBinaryPath: '../bin/eth-rpc',\n dev: true,\n },\n },\n ```\n\n4. Create a `.env` file in your project root to store your private keys (you can use as an example the `env.example` file):\n\n ```text title=\".env\"\n LOCAL_PRIV_KEY=\"INSERT_LOCAL_PRIVATE_KEY\"\n AH_PRIV_KEY=\"INSERT_AH_PRIVATE_KEY\"\n ```\n\n Ensure to replace `\"INSERT_LOCAL_PRIVATE_KEY\"` with a private key available in the local environment (you can get them from this [file](https://github.com/paritytech/hardhat-polkadot/blob/main/packages/hardhat-polkadot-node/src/constants.ts#L22){target=\\_blank}). And `\"INSERT_AH_PRIVATE_KEY\"` with the account's private key you want to use to deploy the contracts. You can get this by exporting the private key from your wallet (e.g., MetaMask).\n\n !!!warning\n Keep your private key safe, and never share it with anyone. If it is compromised, your funds can be stolen.\n\n5. Compile the contracts:\n\n ```bash\n npx hardhat compile\n ```\n\nIf the compilation is successful, you should see the following output:\n\n
\n npx hardhat compile\n Compiling 12 Solidity files\n Successfully compiled 12 Solidity files\n
\n\nAfter running the above command, you should see the compiled contracts in the `artifacts-pvm` directory. This directory contains the ABI and bytecode of your contracts."} @@ -1483,45 +1483,45 @@ {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 197, "end_char": 815, "estimated_token_count": 123, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCreating [smart contracts](/develop/smart-contracts/overview/){target=\\_blank} is fundamental to blockchain development. While many frameworks and tools are available, understanding how to write a contract from scratch with just a text editor is essential knowledge.\n\nThis tutorial will guide you through creating a basic smart contract that can be used with other tutorials for deployment and integration on Polkadot Hub. To understand how smart contracts work in Polkadot Hub, check the [Smart Contract Basics](/polkadot-protocol/smart-contract-basics/){target=\\_blank} guide for more information."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 815, "end_char": 1267, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- A text editor of your choice ([VS Code](https://code.visualstudio.com/){target=\\_blank}, [Sublime Text](https://www.sublimetext.com/){target=\\_blank}, etc.).\n- Basic understanding of programming concepts.\n- Familiarity with the Solidity programming language syntax. For further references, check the official [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank}."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 2, "depth": 2, "title": "Understanding Smart Contract Structure", "anchor": "understanding-smart-contract-structure", "start_char": 1267, "end_char": 2249, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "## Understanding Smart Contract Structure\n\nLet's explore these components before building the contract:\n\n- **[SPDX license identifier](https://docs.soliditylang.org/en/v0.6.8/layout-of-source-files.html){target=\\_blank}**: A standardized way to declare the license under which your code is released. This helps with legal compliance and is required by the Solidity compiler to avoid warnings.\n- **Pragma directive**: Specifies which version of Solidity compiler should be used for your contract.\n- **Contract declaration**: Similar to a class in object-oriented programming, it defines the boundaries of your smart contract.\n- **State variables**: Data stored directly in the contract that persists between function calls. 
These represent the contract's \"state\" on the blockchain.\n- **Functions**: Executable code that can read or modify the contract's state variables.\n- **Events**: Notification mechanisms that applications can subscribe to in order to track blockchain changes."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 3, "depth": 2, "title": "Create the Smart Contract", "anchor": "create-the-smart-contract", "start_char": 2249, "end_char": 5735, "estimated_token_count": 680, "token_estimator": "heuristic-v1", "text": "## Create the Smart Contract\n\nIn this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners. It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes.\n\nThis contract will:\n\n- Store a number.\n- Allow updating the stored number.\n- Emit an event when the number changes.\n\nTo build the smart contract, follow the steps below:\n\n1. Create a new file named `Storage.sol`.\n\n2. Add the SPDX license identifier at the top of the file:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n ```\n\n This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings.\n\n3. Specify the Solidity version:\n\n ```solidity\n pragma solidity ^0.8.28;\n ```\n\n The caret `^` means \"this version or any compatible newer version.\" This helps ensure your contract compiles correctly with the intended compiler features.\n\n4. Create the contract structure:\n\n ```solidity\n contract Storage {\n // Contract code will go here\n }\n ```\n\n This defines a contract named \"Storage\", similar to how you would define a class in other programming languages.\n\n5. Add the state variables and event:\n\n ```solidity\n contract Storage {\n // State variable to store a number\n uint256 private number;\n \n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n }\n ```\n\n Here, you're defining:\n\n - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract.\n - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data.\n\n6. Add the getter and setter functions:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```\n\n??? 
code \"Complete Storage.sol contract\"\n\n ```solidity title=\"Storage.sol\"\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 4, "depth": 2, "title": "Understanding the Code", "anchor": "understanding-the-code", "start_char": 5735, "end_char": 8302, "estimated_token_count": 524, "token_estimator": "heuristic-v1", "text": "## Understanding the Code\n\nLet's break down the key components of the contract:\n\n- **State Variable**\n\n - **`uint256 private number`**: A private variable that can only be accessed through the contract's functions.\n - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. \"Private\" in Solidity doesn't mean the data is encrypted or truly hidden.\n - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations).\n\n- **Event**\n\n - **`event NumberChanged(uint256 newNumber)`**: Emitted when the stored number changes.\n - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications.\n - Unlike state variables, events cannot be read by smart contracts, only by external applications.\n - Events are much more gas-efficient than storing data when you only need to notify external systems of changes.\n\n- **Functions**\n\n - **`store(uint256 newNumber)`**: Updates the stored number and emits an event.\n - This function changes the state of the contract and requires a transaction to execute.\n - The `emit` keyword is used to trigger the defined event.\n\n - **`retrieve()`**: Returns the current stored number.\n - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state.\n - View functions don't require a transaction and don't cost gas when called externally.\n\n For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed.\n\nThis basic contract serves as a foundation for learning smart contract development. 
Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment.\n\nFor more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\\_blank}."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 8302, "end_char": 8670, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n\n
\n\n- Tutorial __Test and Deploy with Hardhat__\n\n ---\n\n Learn how to test and deploy the smart contract you created by using Hardhat.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)\n\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 3, "depth": 2, "title": "Create the Smart Contract", "anchor": "create-the-smart-contract", "start_char": 2249, "end_char": 4545, "estimated_token_count": 480, "token_estimator": "heuristic-v1", "text": "## Create the Smart Contract\n\nIn this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners. It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes.\n\nThis contract will:\n\n- Store a number.\n- Allow updating the stored number.\n- Emit an event when the number changes.\n\nTo build the smart contract, follow the steps below:\n\n1. Create a new file named `Storage.sol`.\n\n2. Add the SPDX license identifier at the top of the file:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n ```\n\n This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings.\n\n3. Specify the Solidity version:\n\n ```solidity\n pragma solidity ^0.8.28;\n ```\n\n The caret `^` means \"this version or any compatible newer version.\" This helps ensure your contract compiles correctly with the intended compiler features.\n\n4. Create the contract structure:\n\n ```solidity\n contract Storage {\n // Contract code will go here\n }\n ```\n\n This defines a contract named \"Storage\", similar to how you would define a class in other programming languages.\n\n5. Add the state variables and event:\n\n ```solidity\n contract Storage {\n // State variable to store a number\n uint256 private number;\n \n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n }\n ```\n\n Here, you're defining:\n\n - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract.\n - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data.\n\n6. Add the getter and setter functions:\n\n ```solidity\n \n ```\n\n??? code \"Complete Storage.sol contract\"\n\n ```solidity title=\"Storage.sol\"\n \n ```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 4, "depth": 2, "title": "Understanding the Code", "anchor": "understanding-the-code", "start_char": 4545, "end_char": 7112, "estimated_token_count": 524, "token_estimator": "heuristic-v1", "text": "## Understanding the Code\n\nLet's break down the key components of the contract:\n\n- **State Variable**\n\n - **`uint256 private number`**: A private variable that can only be accessed through the contract's functions.\n - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. 
\"Private\" in Solidity doesn't mean the data is encrypted or truly hidden.\n - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations).\n\n- **Event**\n\n - **`event NumberChanged(uint256 newNumber)`**: Emitted when the stored number changes.\n - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications.\n - Unlike state variables, events cannot be read by smart contracts, only by external applications.\n - Events are much more gas-efficient than storing data when you only need to notify external systems of changes.\n\n- **Functions**\n\n - **`store(uint256 newNumber)`**: Updates the stored number and emits an event.\n - This function changes the state of the contract and requires a transaction to execute.\n - The `emit` keyword is used to trigger the defined event.\n\n - **`retrieve()`**: Returns the current stored number.\n - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state.\n - View functions don't require a transaction and don't cost gas when called externally.\n\n For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed.\n\nThis basic contract serves as a foundation for learning smart contract development. Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment.\n\nFor more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\\_blank}."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 7112, "end_char": 7480, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n\n
\n\n- Tutorial __Test and Deploy with Hardhat__\n\n ---\n\n Learn how to test and deploy the smart contract you created by using Hardhat.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)\n\n
"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 202, "end_char": 1019, "estimated_token_count": 167, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nDecentralized applications (dApps) have become a cornerstone of the Web3 ecosystem, allowing developers to create applications that interact directly with blockchain networks. Polkadot Hub, a blockchain that supports smart contract functionality, provides an excellent platform for deploying and interacting with dApps.\n\nIn this tutorial, you'll build a complete dApp that interacts with a smart contract deployed on the Polkadot Hub TestNet. It will use [Ethers.js](/develop/smart-contracts/libraries/ethers-js){target=\\_blank} to interact with the blockchain and [Next.js](https://nextjs.org/){target=\\_blank} as the frontend framework. By the end of this tutorial, you'll have a functional dApp that allows users to connect their wallets, read data from the blockchain, and execute transactions."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1019, "end_char": 1479, "estimated_token_count": 111, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, make sure you have:\n\n- [Node.js](https://nodejs.org/en){target=\\_blank} v16 or newer installed on your machine.\n- A crypto wallet (like MetaMask) with some test tokens. For further information, check the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot){target=\\_blank} guide.\n- Basic understanding of React and JavaScript.\n- Familiarity with blockchain concepts and Solidity (helpful but not mandatory)."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 2, "depth": 2, "title": "Project Overview", "anchor": "project-overview", "start_char": 1479, "end_char": 2676, "estimated_token_count": 301, "token_estimator": "heuristic-v1", "text": "## Project Overview\n\nThe dApp will interact with a simple Storage contract. For a step-by-step guide on creating it, refer to the [Create Contracts](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} tutorial. This contract allows:\n\n- Reading a stored number from the blockchain.\n- Updating the stored number with a new value.\n\nThe contract has already been deployed to the Polkadot Hub TestNet for testing purposes: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. 
If you want to deploy your own, follow the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\\_blank} section.\n\nHere's a simplified view of what you'll be building:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-1.webp)\n\nThe general structure of the project should end up as follows:\n\n```bash\nethers-dapp\n├── abis\n│ └── Storage.json\n└── app\n ├── components\n │ ├── ReadContract.js\n │ ├── WalletConnect.js\n │ └── WriteContract.js\n ├── favicon.ico\n ├── globals.css\n ├── layout.js\n ├── page.js\n └── utils\n ├── contract.js\n └── ethers.js\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 3, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2676, "end_char": 2923, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nLet's start by creating a new Next.js project:\n\n```bash\nnpx create-next-app ethers-dapp --js --eslint --tailwind --app --yes\ncd ethers-dapp\n```\n\nNext, install the needed dependencies:\n\n```bash\nnpm install ethers@6.13.5\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2923, "end_char": 4631, "estimated_token_count": 417, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code:\n\n```javascript title=\"app/utils/ethers.js\"\nimport { JsonRpcProvider } from 'ethers';\n\nexport const PASSET_HUB_CONFIG = {\n name: 'Passet Hub',\n rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC\n chainId: 420420422, // Passet Hub testnet chainId\n blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/',\n};\n\nexport const getProvider = () => {\n return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, {\n chainId: PASSET_HUB_CONFIG.chainId,\n name: PASSET_HUB_CONFIG.name,\n });\n};\n\n// Helper to get a signer from a provider\nexport const getSigner = async (provider) => {\n if (window.ethereum) {\n await window.ethereum.request({ method: 'eth_requestAccounts' });\n const ethersProvider = new ethers.BrowserProvider(window.ethereum);\n return ethersProvider.getSigner();\n }\n throw new Error('No Ethereum browser provider detected');\n};\n```\n\nThis file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. 
The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 4631, "end_char": 6548, "estimated_token_count": 403, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple Storage contract already deployed. So, you need to create an interface to interact with it. First, ensure to create a folder called `abis` at the root of your project, create a file `Storage.json`, and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n???+ code \"Storage.sol ABI\"\n\n ```json title=\"abis/Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNow, create a file called `app/utils/contract.js`:\n\n```javascript title=\"app/utils/contract.js\"\nimport { Contract } from 'ethers';\nimport { getProvider } from './ethers';\nimport StorageABI from '../../abis/Storage.json';\n\nexport const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f';\n\nexport const CONTRACT_ABI = StorageABI;\n\nexport const getContract = () => {\n const provider = getProvider();\n return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider);\n};\n\nexport const getSignedContract = async (signer) => {\n return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer);\n};\n```\n\nThis file defines the contract address, ABI, and functions to create instances of the contract for reading and writing."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 6548, "end_char": 12876, "estimated_token_count": 1445, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNext, let's create a component to handle wallet connections. 
Create a new file called `app/components/WalletConnect.js`:\n\n```javascript title=\"app/components/WalletConnect.js\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { PASSET_HUB_CONFIG } from '../utils/ethers';\n\nconst WalletConnect = ({ onConnect }) => {\n const [account, setAccount] = useState(null);\n const [chainId, setChainId] = useState(null);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Check if user already has an authorized wallet connection\n const checkConnection = async () => {\n if (window.ethereum) {\n try {\n // eth_accounts doesn't trigger the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_accounts',\n });\n if (accounts.length > 0) {\n setAccount(accounts[0]);\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n });\n setChainId(parseInt(chainIdHex, 16));\n }\n } catch (err) {\n console.error('Error checking connection:', err);\n setError('Failed to check wallet connection');\n }\n }\n };\n\n checkConnection();\n\n if (window.ethereum) {\n // Setup wallet event listeners\n window.ethereum.on('accountsChanged', (accounts) => {\n setAccount(accounts[0] || null);\n if (accounts[0] && onConnect) onConnect(accounts[0]);\n });\n\n window.ethereum.on('chainChanged', (chainIdHex) => {\n setChainId(parseInt(chainIdHex, 16));\n });\n }\n\n return () => {\n // Cleanup event listeners\n if (window.ethereum) {\n window.ethereum.removeListener('accountsChanged', () => {});\n window.ethereum.removeListener('chainChanged', () => {});\n }\n };\n }, [onConnect]);\n\n const connectWallet = async () => {\n if (!window.ethereum) {\n setError(\n 'MetaMask not detected! Please install MetaMask to use this dApp.'\n );\n return;\n }\n\n try {\n // eth_requestAccounts triggers the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_requestAccounts',\n });\n setAccount(accounts[0]);\n\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n });\n const currentChainId = parseInt(chainIdHex, 16);\n setChainId(currentChainId);\n\n // Prompt user to switch networks if needed\n if (currentChainId !== PASSET_HUB_CONFIG.chainId) {\n await switchNetwork();\n }\n\n if (onConnect) onConnect(accounts[0]);\n } catch (err) {\n console.error('Error connecting to wallet:', err);\n setError('Failed to connect wallet');\n }\n };\n\n const switchNetwork = async () => {\n try {\n await window.ethereum.request({\n method: 'wallet_switchEthereumChain',\n params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }],\n });\n } catch (switchError) {\n // Error 4902 means the chain hasn't been added to MetaMask\n if (switchError.code === 4902) {\n try {\n await window.ethereum.request({\n method: 'wallet_addEthereumChain',\n params: [\n {\n chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`,\n chainName: PASSET_HUB_CONFIG.name,\n rpcUrls: [PASSET_HUB_CONFIG.rpc],\n blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer],\n },\n ],\n });\n } catch (addError) {\n setError('Failed to add network to wallet');\n }\n } else {\n setError('Failed to switch network');\n }\n }\n };\n\n // UI-only disconnection - MetaMask doesn't support programmatic disconnection\n const disconnectWallet = () => {\n setAccount(null);\n };\n\n return (\n
\n {error &&

{error}

}\n\n {!account ? (\n \n Connect Wallet\n \n ) : (\n
\n \n {`${account.substring(0, 6)}...${account.substring(38)}`}\n \n \n Disconnect\n \n {chainId !== PASSET_HUB_CONFIG.chainId && (\n \n Switch to Passet Hub\n \n )}\n
\n )}\n
\n );\n};\n\nexport default WalletConnect;\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. \n\nTo integrate this component to your dApp, you need to overwrite the existing boilerplate in `app/page.js` with the following code:\n\n```javascript title=\"app/page.js\"\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n
\n );\n}\n```\n\nIn your terminal, you can launch your project by running:\n\n```bash\nnpm run dev\n```\n\nAnd you will see the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-2.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 7, "depth": 2, "title": "Read Data from the Blockchain", "anchor": "read-data-from-the-blockchain", "start_char": 12876, "end_char": 16092, "estimated_token_count": 805, "token_estimator": "heuristic-v1", "text": "## Read Data from the Blockchain\n\nNow, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`:\n\n```javascript title=\"app/components/ReadContract.js\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { getContract } from '../utils/contract';\n\nconst ReadContract = () => {\n const [storedNumber, setStoredNumber] = useState(null);\n const [loading, setLoading] = useState(true);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Function to read data from the blockchain\n const fetchData = async () => {\n try {\n setLoading(true);\n const contract = getContract();\n // Call the smart contract's storedNumber function\n const number = await contract.storedNumber();\n setStoredNumber(number.toString());\n setError(null);\n } catch (err) {\n console.error('Error fetching stored number:', err);\n setError('Failed to fetch data from the contract');\n } finally {\n setLoading(false);\n }\n };\n\n fetchData();\n\n // Poll for updates every 10 seconds to keep UI in sync with blockchain\n const interval = setInterval(fetchData, 10000);\n\n // Clean up interval on component unmount\n return () => clearInterval(interval);\n }, []);\n\n return (\n
\n

Contract Data

\n {loading ? (\n
\n
\n
\n ) : error ? (\n

{error}

\n ) : (\n
\n

\n Stored Number: {storedNumber}\n

\n
\n )}\n
\n );\n};\n\nexport default ReadContract;\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically.\n\nTo see this change in your dApp, you need to integrate this component into the `app/page.js` file:\n\n```javascript title=\"app/page.js\"\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nimport ReadContract from './components/ReadContract';\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n \n
\n );\n}\n```\n\nYour dApp will automatically be updated to the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-3.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 8, "depth": 2, "title": "Write Data to the Blockchain", "anchor": "write-data-to-the-blockchain", "start_char": 16092, "end_char": 21164, "estimated_token_count": 1229, "token_estimator": "heuristic-v1", "text": "## Write Data to the Blockchain\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`:\n\n```javascript title=\"app/components/WriteContract.js\"\n'use client';\n\nimport { useState } from 'react';\nimport { getSignedContract } from '../utils/contract';\nimport { ethers } from 'ethers';\n\nconst WriteContract = ({ account }) => {\n const [newNumber, setNewNumber] = useState('');\n const [status, setStatus] = useState({ type: null, message: '' });\n const [isSubmitting, setIsSubmitting] = useState(false);\n\n const handleSubmit = async (e) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: 'error', message: 'Please connect your wallet first' });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: 'error', message: 'Please enter a valid number' });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: 'info', message: 'Initiating transaction...' });\n\n // Get a signer from the connected wallet\n const provider = new ethers.BrowserProvider(window.ethereum);\n const signer = await provider.getSigner();\n const contract = await getSignedContract(signer);\n\n // Send transaction to blockchain and wait for user confirmation in wallet\n setStatus({\n type: 'info',\n message: 'Please confirm the transaction in your wallet...',\n });\n\n // Call the contract's setNumber function\n const tx = await contract.setNumber(newNumber);\n\n // Wait for transaction to be mined\n setStatus({\n type: 'info',\n message: 'Transaction submitted. Waiting for confirmation...',\n });\n const receipt = await tx.wait();\n\n setStatus({\n type: 'success',\n message: `Transaction confirmed! Transaction hash: ${receipt.hash}`,\n });\n setNewNumber('');\n } catch (err) {\n console.error('Error updating number:', err);\n\n // Error code 4001 is MetaMask's code for user rejection\n if (err.code === 4001) {\n setStatus({ type: 'error', message: 'Transaction rejected by user.' });\n } else {\n setStatus({\n type: 'error',\n message: `Error: ${err.message || 'Failed to send transaction'}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n {status.message && (\n \n {status.message}\n
\n )}\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? 'Updating...' : 'Update'}\n \n \n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n
\n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed.\n\nUpdate the `app/page.js` file to integrate all components:\n\n```javascript title=\"app/page.js\"\n'use client';\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nimport ReadContract from './components/ReadContract';\nimport WriteContract from './components/WriteContract';\n\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n \n \n
\n );\n}\n```\n\nThe completed UI will display:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-4.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 9, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 21164, "end_char": 21978, "estimated_token_count": 171, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've built a complete dApp that interacts with a smart contract on the Polkadot Hub TestNet using Ethers.js and Next.js. Your application can now:\n\n- Connect to a user's wallet.\n- Read data from a smart contract.\n- Send transactions to update the contract state.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With these building blocks, you can extend your application to interact with more sophisticated smart contracts and create more advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/ethers-dapp\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2923, "end_char": 3781, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code:\n\n```javascript title=\"app/utils/ethers.js\"\n\n```\n\nThis file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 3781, "end_char": 4485, "estimated_token_count": 169, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple Storage contract already deployed. So, you need to create an interface to interact with it. First, ensure to create a folder called `abis` at the root of your project, create a file `Storage.json`, and paste the corresponding ABI (Application Binary Interface) of the Storage contract. 
You can copy and paste the following:\n\n???+ code \"Storage.sol ABI\"\n\n ```json title=\"abis/Storage.json\"\n \n ```\n\nNow, create a file called `app/utils/contract.js`:\n\n```javascript title=\"app/utils/contract.js\"\n\n```\n\nThis file defines the contract address, ABI, and functions to create instances of the contract for reading and writing."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 4485, "end_char": 5247, "estimated_token_count": 188, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNext, let's create a component to handle wallet connections. Create a new file called `app/components/WalletConnect.js`:\n\n```javascript title=\"app/components/WalletConnect.js\"\n\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. \n\nTo integrate this component to your dApp, you need to overwrite the existing boilerplate in `app/page.js` with the following code:\n\n```javascript title=\"app/page.js\"\n\n\n\n\n```\n\nIn your terminal, you can launch your project by running:\n\n```bash\nnpm run dev\n```\n\nAnd you will see the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-2.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 7, "depth": 2, "title": "Read Data from the Blockchain", "anchor": "read-data-from-the-blockchain", "start_char": 5247, "end_char": 5940, "estimated_token_count": 177, "token_estimator": "heuristic-v1", "text": "## Read Data from the Blockchain\n\nNow, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`:\n\n```javascript title=\"app/components/ReadContract.js\"\n\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically.\n\nTo see this change in your dApp, you need to integrate this component into the `app/page.js` file:\n\n```javascript title=\"app/page.js\"\n\n\n\n\n```\n\nYour dApp will automatically be updated to the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-3.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 8, "depth": 2, "title": "Write Data to the Blockchain", "anchor": "write-data-to-the-blockchain", "start_char": 5940, "end_char": 6681, "estimated_token_count": 181, "token_estimator": "heuristic-v1", "text": "## Write Data to the Blockchain\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`:\n\n```javascript title=\"app/components/WriteContract.js\"\n\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. 
When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed.\n\nUpdate the `app/page.js` file to integrate all components:\n\n```javascript title=\"app/page.js\"\n\n```\n\nThe completed UI will display:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-4.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 9, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 6681, "end_char": 7495, "estimated_token_count": 171, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've built a complete dApp that interacts with a smart contract on the Polkadot Hub TestNet using Ethers.js and Next.js. Your application can now:\n\n- Connect to a user's wallet.\n- Read data from a smart contract.\n- Send transactions to update the contract state.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With these building blocks, you can extend your application to interact with more sophisticated smart contracts and create more advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/ethers-dapp\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 0, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 890, "end_char": 1375, "estimated_token_count": 115, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore getting started, ensure you have the following:\n\n- [Node.js](https://nodejs.org/en){target=\\_blank} v16 or later installed on your system.\n- A crypto wallet (such as MetaMask) funded with test tokens. Refer to the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot){target=\\_blank} guide for more details.\n- A basic understanding of React and JavaScript.\n- Some familiarity with blockchain fundamentals and Solidity (useful but not required)."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 1, "depth": 2, "title": "Project Overview", "anchor": "project-overview", "start_char": 1375, "end_char": 2276, "estimated_token_count": 235, "token_estimator": "heuristic-v1", "text": "## Project Overview\n\nThis dApp will interact with a basic Storage contract. Refer to the [Create Contracts](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} tutorial for a step-by-step guide on creating this contract. 
The contract allows:\n\n- Retrieving a stored number from the blockchain.\n- Updating the stored number with a new value.\n\n\nBelow is a high-level overview of what you'll be building:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-1.webp)\n\nYour project directory will be organized as follows:\n\n```bash\nviem-dapp\n├── abis\n│ └── Storage.json\n└── app\n ├── components\n │ ├── ReadContract.tsx\n │ ├── WalletConnect.tsx\n │ └── WriteContract.tsx\n ├── favicon.ico\n ├── globals.css\n ├── layout.tsx\n ├── page.tsx\n └── utils\n ├── contract.ts\n └── viem.ts\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2276, "end_char": 2423, "estimated_token_count": 49, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nCreate a new Next.js project:\n\n```bash\nnpx create-next-app viem-dapp --ts --eslint --tailwind --app --yes\ncd viem-dapp\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 3, "depth": 2, "title": "Install Dependencies", "anchor": "install-dependencies", "start_char": 2423, "end_char": 2567, "estimated_token_count": 38, "token_estimator": "heuristic-v1", "text": "## Install Dependencies\n\nInstall viem and related packages:\n\n```bash\nnpm install viem@2.23.6\nnpm install --save-dev typescript @types/node\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2567, "end_char": 4571, "estimated_token_count": 491, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. 
Start by creating a new file called `utils/viem.ts` and add the following code:\n\n```typescript title=\"viem.ts\"\nimport { createPublicClient, http, createWalletClient, custom } from 'viem'\nimport 'viem/window';\n\n\nconst transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io')\n\n// Configure the Passet Hub chain\nexport const passetHub = {\n id: 420420422,\n name: 'Passet Hub',\n network: 'passet-hub',\n nativeCurrency: {\n decimals: 18,\n name: 'PAS',\n symbol: 'PAS',\n },\n rpcUrls: {\n default: {\n http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'],\n },\n },\n} as const\n\n// Create a public client for reading data\nexport const publicClient = createPublicClient({\n chain: passetHub,\n transport\n})\n\n// Create a wallet client for signing transactions\nexport const getWalletClient = async () => {\n if (typeof window !== 'undefined' && window.ethereum) {\n const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' });\n return createWalletClient({\n chain: passetHub,\n transport: custom(window.ethereum),\n account,\n });\n }\n throw new Error('No Ethereum browser provider detected');\n};\n```\n\nThis file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\\_blank} reference for more information."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 4571, "end_char": 7083, "estimated_token_count": 522, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple [Storage contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} that's already deployed in the Polkadot Hub TestNet: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. To interact with it, you need to define the contract interface.\n\nCreate a folder called `abis` at the root of your project, then create a file named `Storage.json` and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n??? 
code \"Storage.sol ABI\"\n ```json title=\"Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNext, create a file called `utils/contract.ts`:\n\n```typescript title=\"contract.ts\"\nimport { getContract } from 'viem';\nimport { publicClient, getWalletClient } from './viem';\nimport StorageABI from '../../abis/Storage.json';\n\nexport const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f';\nexport const CONTRACT_ABI = StorageABI;\n\n// Create a function to get a contract instance for reading\nexport const getContractInstance = () => {\n return getContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n client: publicClient,\n });\n};\n\n// Create a function to get a contract instance with a signer for writing\nexport const getSignedContract = async () => {\n const walletClient = await getWalletClient();\n return getContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n client: walletClient,\n });\n};\n```\n\nThis file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 7083, "end_char": 14170, "estimated_token_count": 1627, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNow, let's create a component to handle wallet connections. 
Create a new file called `components/WalletConnect.tsx`:\n\n```typescript title=\"WalletConnect.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { passetHub } from \"../utils/viem\";\n\ninterface WalletConnectProps {\n onConnect: (account: string) => void;\n}\n\nconst WalletConnect: React.FC = ({ onConnect }) => {\n const [account, setAccount] = useState(null);\n const [chainId, setChainId] = useState(null);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Check if user already has an authorized wallet connection\n const checkConnection = async () => {\n if (typeof window !== 'undefined' && window.ethereum) {\n try {\n // eth_accounts doesn't trigger the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_accounts',\n }) as string[];\n \n if (accounts.length > 0) {\n setAccount(accounts[0]);\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n }) as string;\n setChainId(parseInt(chainIdHex, 16));\n onConnect(accounts[0]);\n }\n } catch (err) {\n console.error('Error checking connection:', err);\n setError('Failed to check wallet connection');\n }\n }\n };\n\n checkConnection();\n\n if (typeof window !== 'undefined' && window.ethereum) {\n // Setup wallet event listeners\n window.ethereum.on('accountsChanged', (accounts: string[]) => {\n setAccount(accounts[0] || null);\n if (accounts[0]) onConnect(accounts[0]);\n });\n\n window.ethereum.on('chainChanged', (chainIdHex: string) => {\n setChainId(parseInt(chainIdHex, 16));\n });\n }\n\n return () => {\n // Cleanup event listeners\n if (typeof window !== 'undefined' && window.ethereum) {\n window.ethereum.removeListener('accountsChanged', () => {});\n window.ethereum.removeListener('chainChanged', () => {});\n }\n };\n }, [onConnect]);\n\n const connectWallet = async () => {\n if (typeof window === 'undefined' || !window.ethereum) {\n setError(\n 'MetaMask not detected! 
Please install MetaMask to use this dApp.'\n );\n return;\n }\n\n try {\n // eth_requestAccounts triggers the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_requestAccounts',\n }) as string[];\n \n setAccount(accounts[0]);\n\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n }) as string;\n \n const currentChainId = parseInt(chainIdHex, 16);\n setChainId(currentChainId);\n\n // Prompt user to switch networks if needed\n if (currentChainId !== passetHub.id) {\n await switchNetwork();\n }\n\n onConnect(accounts[0]);\n } catch (err) {\n console.error('Error connecting to wallet:', err);\n setError('Failed to connect wallet');\n }\n };\n\n const switchNetwork = async () => {\n console.log('Switch network')\n try {\n await window.ethereum.request({\n method: 'wallet_switchEthereumChain',\n params: [{ chainId: `0x${passetHub.id.toString(16)}` }],\n });\n } catch (switchError: any) {\n // Error 4902 means the chain hasn't been added to MetaMask\n if (switchError.code === 4902) {\n try {\n await window.ethereum.request({\n method: 'wallet_addEthereumChain',\n params: [\n {\n chainId: `0x${passetHub.id.toString(16)}`,\n chainName: passetHub.name,\n rpcUrls: [passetHub.rpcUrls.default.http[0]],\n nativeCurrency: {\n name: passetHub.nativeCurrency.name,\n symbol: passetHub.nativeCurrency.symbol,\n decimals: passetHub.nativeCurrency.decimals,\n },\n },\n ],\n });\n } catch (addError) {\n setError('Failed to add network to wallet');\n }\n } else {\n setError('Failed to switch network');\n }\n }\n };\n\n // UI-only disconnection - MetaMask doesn't support programmatic disconnection\n const disconnectWallet = () => {\n setAccount(null);\n };\n\n return (\n
\n {error &&

{error}

}\n\n {!account ? (\n \n Connect Wallet\n \n ) : (\n
\n \n {`${account.substring(0, 6)}...${account.substring(38)}`}\n \n \n Disconnect\n \n {chainId !== passetHub.id && (\n \n Switch to Passet Hub\n \n )}\n
\n )}\n
\n );\n};\n\nexport default WalletConnect;\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected.\n\nTo use this component in your dApp, replace the existing boilerplate in `app/page.tsx` with the following code:\n\n```typescript title=\"page.tsx\"\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n
\n );\n}\n```\n\nNow you're ready to run your dApp. From your project directory, execute:\n\n```bash\nnpm run dev\n```\n\nNavigate to `http://localhost:3000` in your browser, and you should see your dApp with the wallet connection button, the stored number display, and the form to update the number.\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-2.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 7, "depth": 2, "title": "Create the Read Contract Component", "anchor": "create-the-read-contract-component", "start_char": 14170, "end_char": 17647, "estimated_token_count": 854, "token_estimator": "heuristic-v1", "text": "## Create the Read Contract Component\n\nNow, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`:\n\n```typescript title=\"ReadContract.tsx\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { publicClient } from '../utils/viem';\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract';\n\nconst ReadContract: React.FC = () => {\n const [storedNumber, setStoredNumber] = useState(null);\n const [loading, setLoading] = useState(true);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Function to read data from the blockchain\n const fetchData = async () => {\n try {\n setLoading(true);\n // Call the smart contract's storedNumber function\n const number = await publicClient.readContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: 'storedNumber',\n args: [],\n }) as bigint;\n\n setStoredNumber(number.toString());\n setError(null);\n } catch (err) {\n console.error('Error fetching stored number:', err);\n setError('Failed to fetch data from the contract');\n } finally {\n setLoading(false);\n }\n };\n\n fetchData();\n\n // Poll for updates every 10 seconds to keep UI in sync with blockchain\n const interval = setInterval(fetchData, 10000);\n\n // Clean up interval on component unmount\n return () => clearInterval(interval);\n }, []);\n\n return (\n
\n

Contract Data

\n {loading ? (\n
\n
\n
\n ) : error ? (\n

{error}

\n ) : (\n
\n

\n Stored Number: {storedNumber}\n

\n
\n )}\n
\n );\n};\n\nexport default ReadContract;\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state.\n\nTo reflect this change in your dApp, incorporate this component into the `app/page.tsx` file.\n\n```typescript title=\"page.tsx\"\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nimport ReadContract from \"./components/ReadContract\";\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n \n
\n );\n}\n```\n\nAnd you will see in your browser:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-3.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 8, "depth": 2, "title": "Create the Write Contract Component", "anchor": "create-the-write-contract-component", "start_char": 17647, "end_char": 25635, "estimated_token_count": 1808, "token_estimator": "heuristic-v1", "text": "## Create the Write Contract Component\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `components/WriteContract.tsx`:\n\n```typescript title=\"WriteContract.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { publicClient, getWalletClient } from \"../utils/viem\";\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from \"../utils/contract\";\n\ninterface WriteContractProps {\n account: string | null;\n}\n\nconst WriteContract: React.FC = ({ account }) => {\n const [newNumber, setNewNumber] = useState(\"\");\n const [status, setStatus] = useState<{\n type: string | null;\n message: string;\n }>({\n type: null,\n message: \"\",\n });\n const [isSubmitting, setIsSubmitting] = useState(false);\n const [isCorrectNetwork, setIsCorrectNetwork] = useState(true);\n\n // Check if the account is on the correct network\n useEffect(() => {\n const checkNetwork = async () => {\n if (!account) return;\n\n try {\n // Get the chainId from the public client\n const chainId = await publicClient.getChainId();\n\n // Get the user's current chainId from their wallet\n const walletClient = await getWalletClient();\n if (!walletClient) return;\n\n const walletChainId = await walletClient.getChainId();\n\n // Check if they match\n setIsCorrectNetwork(chainId === walletChainId);\n } catch (err) {\n console.error(\"Error checking network:\", err);\n setIsCorrectNetwork(false);\n }\n };\n\n checkNetwork();\n }, [account]);\n\n const handleSubmit = async (e: React.FormEvent) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: \"error\", message: \"Please connect your wallet first\" });\n return;\n }\n\n if (!isCorrectNetwork) {\n setStatus({\n type: \"error\",\n message: \"Please switch to the correct network in your wallet\",\n });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: \"error\", message: \"Please enter a valid number\" });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: \"info\", message: \"Initiating transaction...\" });\n\n // Get wallet client for transaction signing\n const walletClient = await getWalletClient();\n\n if (!walletClient) {\n setStatus({ type: \"error\", message: \"Wallet client not available\" });\n return;\n }\n\n // Check if account matches\n if (\n walletClient.account?.address.toLowerCase() !== account.toLowerCase()\n ) {\n setStatus({\n type: \"error\",\n message:\n \"Connected wallet account doesn't match the selected account\",\n });\n return;\n }\n\n // Prepare transaction and wait for user confirmation in wallet\n setStatus({\n type: \"info\",\n message: \"Please confirm the transaction in your wallet...\",\n });\n\n // Simulate the contract call first\n console.log('newNumber', newNumber);\n const { request } = await publicClient.simulateContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: \"setNumber\",\n args: [BigInt(newNumber)],\n account: 
walletClient.account,\n });\n\n // Send the transaction with wallet client\n const hash = await walletClient.writeContract(request);\n\n // Wait for transaction to be mined\n setStatus({\n type: \"info\",\n message: \"Transaction submitted. Waiting for confirmation...\",\n });\n\n const receipt = await publicClient.waitForTransactionReceipt({\n hash,\n });\n\n setStatus({\n type: \"success\",\n message: `Transaction confirmed! Transaction hash: ${receipt.transactionHash}`,\n });\n\n setNewNumber(\"\");\n } catch (err: any) {\n console.error(\"Error updating number:\", err);\n\n // Handle specific errors\n if (err.code === 4001) {\n // User rejected transaction\n setStatus({ type: \"error\", message: \"Transaction rejected by user.\" });\n } else if (err.message?.includes(\"Account not found\")) {\n // Account not found on the network\n setStatus({\n type: \"error\",\n message:\n \"Account not found on current network. Please check your wallet is connected to the correct network.\",\n });\n } else if (err.message?.includes(\"JSON is not a valid request object\")) {\n // JSON error - specific to your current issue\n setStatus({\n type: \"error\",\n message:\n \"Invalid request format. Please try again or contact support.\",\n });\n } else {\n // Other errors\n setStatus({\n type: \"error\",\n message: `Error: ${err.message || \"Failed to send transaction\"}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n\n {!isCorrectNetwork && account && (\n
\n ⚠️ You are not connected to the correct network. Please switch\n networks in your wallet.\n
\n )}\n\n {status.message && (\n \n {status.message}\n
\n )}\n\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? \"Updating...\" : \"Update\"}\n \n \n\n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n
\n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. It provides appropriate feedback during each step of the transaction process and handles error scenarios.\n\nUpdate the `app/page.tsx` file to integrate all components:\n\n```typescript title=\"page.tsx\"\n\"use client\";\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nimport ReadContract from \"./components/ReadContract\";\nimport WriteContract from \"./components/WriteContract\";\n\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n \n \n
\n );\n}\n```\nAfter that, you will see:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-4.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 9, "depth": 2, "title": "How It Works", "anchor": "how-it-works", "start_char": 25635, "end_char": 26779, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## How It Works\n\nLet's examine how the dApp interacts with the blockchain:\n\n1. Wallet connection: \n\n - The `WalletConnect` component uses the browser's Ethereum provider (MetaMask) to connect to the user's wallet.\n - It handles network switching to ensure the user is connected to the Polkadot Hub TestNet.\n - Once connected, it provides the user's account address to the parent component.\n\n2. Reading data:\n\n - The `ReadContract` component uses viem's `readContract` function to call the `storedNumber` view function.\n - It periodically polls for updates to keep the UI in sync with the blockchain state.\n - The component displays a loading indicator while fetching data and handles error states.\n\n3. Writing data:\n\n - The `WriteContract` component uses viem's `writeContract` function to send a transaction to the `setNumber` function.\n - It ensures the wallet is connected before allowing a transaction.\n - The component shows detailed feedback during transaction submission and confirmation.\n - After a successful transaction, the value displayed in the `ReadContract` component will update on the next poll."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 10, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 26779, "end_char": 27636, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully built a fully functional dApp that interacts with a smart contract on Polkadot Hub using viem and Next.js. Your application can now:\n\n- Connect to a user's wallet and handle network switching.\n- Read data from a smart contract and keep it updated.\n- Write data to the blockchain through transactions.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With this knowledge, you can extend your application to interact with more sophisticated smart contracts and create advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/viem-dapp\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 11, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 27636, "end_char": 27950, "estimated_token_count": 81, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Guide __Create a dApp with Wagmi__\n\n ---\n\n Learn how to build a decentralized application by using the Wagmi framework.\n\n [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/libraries/wagmi)\n\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2567, "end_char": 3520, "estimated_token_count": 238, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code:\n\n```typescript title=\"viem.ts\"\n\n```\n\nThis file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\\_blank} reference for more information."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 3520, "end_char": 5274, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple [Storage contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} that's already deployed in the Polkadot Hub TestNet: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. To interact with it, you need to define the contract interface.\n\nCreate a folder called `abis` at the root of your project, then create a file named `Storage.json` and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n??? code \"Storage.sol ABI\"\n ```json title=\"Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNext, create a file called `utils/contract.ts`:\n\n```typescript title=\"contract.ts\"\n\n```\n\nThis file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\\_blank} for reading and writing operations. 
viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 5274, "end_char": 6261, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNow, let's create a component to handle wallet connections. Create a new file called `components/WalletConnect.tsx`:\n\n```typescript title=\"WalletConnect.tsx\"\n\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected.\n\nTo use this component in your dApp, replace the existing boilerplate in `app/page.tsx` with the following code:\n\n```typescript title=\"page.tsx\"\n\n\n\n\n```\n\nNow you're ready to run your dApp. From your project directory, execute:\n\n```bash\nnpm run dev\n```\n\nNavigate to `http://localhost:3000` in your browser, and you should see your dApp with the wallet connection button, the stored number display, and the form to update the number.\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-2.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 7, "depth": 2, "title": "Create the Read Contract Component", "anchor": "create-the-read-contract-component", "start_char": 6261, "end_char": 6962, "estimated_token_count": 172, "token_estimator": "heuristic-v1", "text": "## Create the Read Contract Component\n\nNow, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`:\n\n```typescript title=\"ReadContract.tsx\"\n\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state.\n\nTo reflect this change in your dApp, incorporate this component into the `app/page.tsx` file.\n\n```typescript title=\"page.tsx\"\n\n\n\n\n```\n\nAnd you will see in your browser:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-3.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 8, "depth": 2, "title": "Create the Write Contract Component", "anchor": "create-the-write-contract-component", "start_char": 6962, "end_char": 14164, "estimated_token_count": 1627, "token_estimator": "heuristic-v1", "text": "## Create the Write Contract Component\n\nFinally, let's create a component that allows users to update the stored number. 
Create a file called `components/WriteContract.tsx`:\n\n```typescript title=\"WriteContract.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { publicClient, getWalletClient } from \"../utils/viem\";\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from \"../utils/contract\";\n\ninterface WriteContractProps {\n account: string | null;\n}\n\nconst WriteContract: React.FC = ({ account }) => {\n const [newNumber, setNewNumber] = useState(\"\");\n const [status, setStatus] = useState<{\n type: string | null;\n message: string;\n }>({\n type: null,\n message: \"\",\n });\n const [isSubmitting, setIsSubmitting] = useState(false);\n const [isCorrectNetwork, setIsCorrectNetwork] = useState(true);\n\n // Check if the account is on the correct network\n useEffect(() => {\n const checkNetwork = async () => {\n if (!account) return;\n\n try {\n // Get the chainId from the public client\n const chainId = await publicClient.getChainId();\n\n // Get the user's current chainId from their wallet\n const walletClient = await getWalletClient();\n if (!walletClient) return;\n\n const walletChainId = await walletClient.getChainId();\n\n // Check if they match\n setIsCorrectNetwork(chainId === walletChainId);\n } catch (err) {\n console.error(\"Error checking network:\", err);\n setIsCorrectNetwork(false);\n }\n };\n\n checkNetwork();\n }, [account]);\n\n const handleSubmit = async (e: React.FormEvent) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: \"error\", message: \"Please connect your wallet first\" });\n return;\n }\n\n if (!isCorrectNetwork) {\n setStatus({\n type: \"error\",\n message: \"Please switch to the correct network in your wallet\",\n });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: \"error\", message: \"Please enter a valid number\" });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: \"info\", message: \"Initiating transaction...\" });\n\n // Get wallet client for transaction signing\n const walletClient = await getWalletClient();\n\n if (!walletClient) {\n setStatus({ type: \"error\", message: \"Wallet client not available\" });\n return;\n }\n\n // Check if account matches\n if (\n walletClient.account?.address.toLowerCase() !== account.toLowerCase()\n ) {\n setStatus({\n type: \"error\",\n message:\n \"Connected wallet account doesn't match the selected account\",\n });\n return;\n }\n\n // Prepare transaction and wait for user confirmation in wallet\n setStatus({\n type: \"info\",\n message: \"Please confirm the transaction in your wallet...\",\n });\n\n // Simulate the contract call first\n console.log('newNumber', newNumber);\n const { request } = await publicClient.simulateContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: \"setNumber\",\n args: [BigInt(newNumber)],\n account: walletClient.account,\n });\n\n // Send the transaction with wallet client\n const hash = await walletClient.writeContract(request);\n\n // Wait for transaction to be mined\n setStatus({\n type: \"info\",\n message: \"Transaction submitted. Waiting for confirmation...\",\n });\n\n const receipt = await publicClient.waitForTransactionReceipt({\n hash,\n });\n\n setStatus({\n type: \"success\",\n message: `Transaction confirmed! 
Transaction hash: ${receipt.transactionHash}`,\n });\n\n setNewNumber(\"\");\n } catch (err: any) {\n console.error(\"Error updating number:\", err);\n\n // Handle specific errors\n if (err.code === 4001) {\n // User rejected transaction\n setStatus({ type: \"error\", message: \"Transaction rejected by user.\" });\n } else if (err.message?.includes(\"Account not found\")) {\n // Account not found on the network\n setStatus({\n type: \"error\",\n message:\n \"Account not found on current network. Please check your wallet is connected to the correct network.\",\n });\n } else if (err.message?.includes(\"JSON is not a valid request object\")) {\n // JSON error - specific to your current issue\n setStatus({\n type: \"error\",\n message:\n \"Invalid request format. Please try again or contact support.\",\n });\n } else {\n // Other errors\n setStatus({\n type: \"error\",\n message: `Error: ${err.message || \"Failed to send transaction\"}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n\n {!isCorrectNetwork && account && (\n
\n ⚠️ You are not connected to the correct network. Please switch\n networks in your wallet.\n
\n )}\n\n {status.message && (\n \n {status.message}\n
\n )}\n\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? \"Updating...\" : \"Update\"}\n \n \n\n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n
\n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. It provides appropriate feedback during each step of the transaction process and handles error scenarios.\n\nUpdate the `app/page.tsx` file to integrate all components:\n\n```typescript title=\"page.tsx\"\n\n```\nAfter that, you will see:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-4.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 9, "depth": 2, "title": "How It Works", "anchor": "how-it-works", "start_char": 14164, "end_char": 15308, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## How It Works\n\nLet's examine how the dApp interacts with the blockchain:\n\n1. Wallet connection: \n\n - The `WalletConnect` component uses the browser's Ethereum provider (MetaMask) to connect to the user's wallet.\n - It handles network switching to ensure the user is connected to the Polkadot Hub TestNet.\n - Once connected, it provides the user's account address to the parent component.\n\n2. Reading data:\n\n - The `ReadContract` component uses viem's `readContract` function to call the `storedNumber` view function.\n - It periodically polls for updates to keep the UI in sync with the blockchain state.\n - The component displays a loading indicator while fetching data and handles error states.\n\n3. Writing data:\n\n - The `WriteContract` component uses viem's `writeContract` function to send a transaction to the `setNumber` function.\n - It ensures the wallet is connected before allowing a transaction.\n - The component shows detailed feedback during transaction submission and confirmation.\n - After a successful transaction, the value displayed in the `ReadContract` component will update on the next poll."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 10, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 15308, "end_char": 16165, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully built a fully functional dApp that interacts with a smart contract on Polkadot Hub using viem and Next.js. Your application can now:\n\n- Connect to a user's wallet and handle network switching.\n- Read data from a smart contract and keep it updated.\n- Write data to the blockchain through transactions.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With this knowledge, you can extend your application to interact with more sophisticated smart contracts and create advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/viem-dapp\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 11, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 16165, "end_char": 16479, "estimated_token_count": 81, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Guide __Create a dApp with Wagmi__\n\n ---\n\n Learn how to build a decentralized application by using the Wagmi framework.\n\n [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/libraries/wagmi)\n\n
"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 202, "end_char": 841, "estimated_token_count": 136, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nAfter creating a smart contract, the next crucial steps are testing and deployment. Proper testing ensures your contract behaves as expected, while deployment makes your contract available on the blockchain. This tutorial will guide you through using Hardhat, a popular development environment, to test and deploy the `Storage.sol` contract you created in the [Create a Smart Contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts/){target=\\_blank} tutorial. For more information about Hardhat usage, check the [Hardhat guide](/develop/smart-contracts/dev-environments/hardhat/){target=\\_blank}."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 841, "end_char": 1367, "estimated_token_count": 147, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- The [`Storage.sol` contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts/#create-the-smart-contract){target=\\_blank} created in the previous tutorial.\n- [Node.js](https://nodejs.org/){target=\\_blank} (v16.0.0 or later) and npm installed.\n- Basic understanding of JavaScript for writing tests.\n- Some PAS test tokens to cover transaction fees (obtained from the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\\_blank})."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 2, "depth": 2, "title": "Setting Up the Development Environment", "anchor": "setting-up-the-development-environment", "start_char": 1367, "end_char": 4508, "estimated_token_count": 706, "token_estimator": "heuristic-v1", "text": "## Setting Up the Development Environment\n\nLet's start by setting up Hardhat for your Storage contract project:\n\n1. Create a new directory for your project and navigate into it:\n\n ```bash\n mkdir storage-hardhat\n cd storage-hardhat\n ```\n\n2. Initialize a new npm project:\n\n ```bash\n npm init -y\n ```\n\n3. Install `hardhat-polkadot` and all required plugins:\n\n ```bash\n npm install --save-dev @parity/hardhat-polkadot@0.1.9 solc@0.8.28\n ```\n\n For dependencies compatibility, ensure to install the `@nomicfoundation/hardhat-toolbox` dependency with the `--force` flag:\n\n ```bash\n npm install --force @nomicfoundation/hardhat-toolbox \n ```\n\n5. Initialize a Hardhat project:\n\n ```bash\n npx hardhat-polkadot init\n ```\n\n Select **Create an empty hardhat.config.js** when prompted.\n\n6. 
Configure Hardhat by updating the `hardhat.config.js` file:\n\n ```javascript title=\"hardhat.config.js\"\n require(\"@nomicfoundation/hardhat-toolbox\");\n\n require(\"@parity/hardhat-polkadot\");\n\n const { vars } = require(\"hardhat/config\");\n\n /** @type import('hardhat/config').HardhatUserConfig */\n module.exports = {\n solidity: \"0.8.28\",\n resolc: {\n compilerSource: \"npm\",\n },\n networks: {\n hardhat: {\n polkavm: true,\n nodeConfig: {\n nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE',\n rpcPort: 8000,\n dev: true,\n },\n adapterConfig: {\n adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER',\n dev: true,\n },\n },\n localNode: {\n polkavm: true,\n url: `http://127.0.0.1:8545`,\n },\n passetHub: {\n polkavm: true,\n url: 'https://testnet-passet-hub-eth-rpc.polkadot.io',\n accounts: [vars.get(\"PRIVATE_KEY\")],\n },\n },\n };\n ```\n\n Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. \n\n If you need to build these binaries, follow the [Installation](/develop/smart-contracts/local-development-node#install-the-substrate-node-and-eth-rpc-adapter){target=\\_blank} section on the Local Development Node page.\n\n The configuration also defines two network settings: \n\n - **`localNode`**: Runs a PolkaVM instance on `http://127.0.0.1:8545` for local development and testing.\n - **`passetHub`**: Connects to the the Polkadot Hub TestNet network using a predefined RPC URL and a private key stored in environment variables.\n\n7. Export your private key and save it in your Hardhat environment:\n\n ```bash\n npx hardhat vars set PRIVATE_KEY \"INSERT_PRIVATE_KEY\"\n ```\n\n Replace `INSERT_PRIVATE_KEY` with your actual private key. \n \n For further details on private key exportation, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\\_blank}.\n\n !!! warning\n Keep your private key safe, and never share it with anyone. If it is compromised, your funds can be stolen."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 3, "depth": 2, "title": "Adding the Smart Contract", "anchor": "adding-the-smart-contract", "start_char": 4508, "end_char": 5883, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "## Adding the Smart Contract\n\n1. Create a new folder called `contracts` and create a `Storage.sol` file. Add the contract code from the previous tutorial:\n\n ```solidity title=\"Storage.sol\"\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```\n\n2. Compile the contract:\n\n ```bash\n npx hardhat compile\n ```\n\n3. If successful, you will see the following output in your terminal:\n\n
\n npx hardhat compile\n Compiling 1 Solidity file\n Successfully compiled 1 Solidity file\n
\n\nAfter compilation, the `artifacts-pvm` and `cache-pvm` folders, containing the metadata and binary files of your compiled contract, will be created in the root of your project."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 4, "depth": 2, "title": "Writing Tests", "anchor": "writing-tests", "start_char": 5883, "end_char": 12912, "estimated_token_count": 1480, "token_estimator": "heuristic-v1", "text": "## Writing Tests\n\nTesting is a critical part of smart contract development. Hardhat makes it easy to write tests in JavaScript using frameworks like [Mocha](https://mochajs.org/){target=\\_blank} and [Chai](https://www.chaijs.com/){target=\\_blank}.\n\n1. Create a folder for testing called `test`. Inside that directory, create a file named `Storage.js` and add the following code:\n\n ```javascript title=\"Storage.js\" \n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n // Add your logic here\n });\n });\n ```\n\n The `beforeEach` hook ensures stateless contract execution by redeploying a fresh instance of the Storage contract before each test case. This approach guarantees that each test starts with a clean and independent contract state by using `ethers.getSigners()` to obtain test accounts and `ethers.getContractFactory('Storage').deploy()` to create a new contract instance.\n\n Now, you can add custom unit tests to check your contract functionality. Some example tests are available below:\n\n 1. **Initial state verification**: Ensures that the contract starts with a default value of zero, which is a fundamental expectation for the `Storage.sol` contract.\n\n ```javascript title=\"Storage.js\"\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n ```\n\n Explanation:\n\n - Checks the initial state of the contract.\n - Verifies that a newly deployed contract has a default value of 0.\n - Confirms the `retrieve()` method works correctly for a new contract.\n\n 2. **Value storage test**: Validate the core functionality of storing and retrieving a value in the contract.\n\n ```javascript title=\"Storage.js\"\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n ```\n\n Explanation:\n\n - Demonstrates the ability to store a specific value.\n - Checks that the stored value can be retrieved correctly.\n - Verifies the basic write and read functionality of the contract.\n\n 3. 
**Event emission verification**: Confirm that the contract emits the correct event when storing a value, which is crucial for off-chain tracking.\n\n ```javascript title=\"Storage.js\"\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n ```\n\n Explanation:\n\n - Ensures the `NumberChanged` event is emitted during storage.\n - Verifies that the event contains the correct stored value.\n - Validates the contract's event logging mechanism.\n\n 4. **Sequential value storage test**: Check the contract's ability to store multiple values sequentially and maintain the most recent value.\n\n ```javascript title=\"Storage.js\"\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n ```\n\n Explanation:\n\n - Verifies that multiple values can be stored in sequence.\n - Confirms that each new store operation updates the contract's state.\n - Demonstrates the contract's ability always to reflect the most recently stored value.\n\n The complete `test/Storage.js` should look like this:\n\n ???--- code \"View complete script\"\n ```javascript title=\"Storage.js\"\n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n });\n });\n ```\n\n2. Run the tests:\n\n ```bash\n npx hardhat test\n ```\n\n3. After running the above command, you will see the output showing that all tests have passed:\n\n
\n npx hardhat test\n Storage\n Basic functionality\n ✔ Should return 0 initially\n ✔ Should update when store is called (1126ms)\n ✔ Should emit an event when storing a value (1131ms)\n ✔ Should allow storing sequentially increasing values (12477ms)\n 4 passing (31s) \n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 5, "depth": 2, "title": "Deploying with Ignition", "anchor": "deploying-with-ignition", "start_char": 12912, "end_char": 16132, "estimated_token_count": 786, "token_estimator": "heuristic-v1", "text": "## Deploying with Ignition\n\n[Hardhat's Ignition](https://hardhat.org/ignition/docs/getting-started#overview){target=\\_blank} is a deployment system designed to make deployments predictable and manageable. Let's create a deployment script:\n\n1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic:\n\n ```javascript title=\"StorageModule.js\"\n const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules');\n\n module.exports = buildModule('StorageModule', (m) => {\n const storage = m.contract('Storage');\n\n return { storage };\n });\n ```\n\n2. Deploy to the local network:\n\n 1. First, start a local node:\n\n ```bash\n npx hardhat node\n ```\n\n 2. Then, in a new terminal window, deploy the contract:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n ```\n\n 3. If successful, output similar to the following will display in your terminal:\n\n
\n npx hardhat ignition deploy ./ignition/modules/Storage.js --network localNode\n ✔ Confirm deploy to network localNode (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xc01Ee7f10EA4aF4673cFff62710E1D7792aBa8f3\n
\n\n3. Deploy to the Polkadot Hub TestNet:\n\n 1. Make sure your account has enough PAS tokens for gas fees, then run:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n ```\n\n 2. After deployment, you'll see the contract address in the console output. Save this address for future interactions.\n\n
\n npx hardhat ignition deploy ./ignition/modules/Storage.js --network passetHub\n ✔ Confirm deploy to network localNode (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xE8693cE64b294E26765573398C7Ca5C700E9C85c\n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 6, "depth": 2, "title": "Interacting with Your Deployed Contract", "anchor": "interacting-with-your-deployed-contract", "start_char": 16132, "end_char": 18086, "estimated_token_count": 460, "token_estimator": "heuristic-v1", "text": "## Interacting with Your Deployed Contract\n\nTo interact with your deployed contract:\n\n1. Create a new folder named `scripts` and add the `interact.js` with the following content:\n\n ```javascript title=\"interact.js\"\n const hre = require('hardhat');\n\n async function main() {\n // Replace with your deployed contract address\n const contractAddress = 'INSERT_DEPLOYED_CONTRACT_ADDRESS';\n\n // Get the contract instance\n const Storage = await hre.ethers.getContractFactory('Storage');\n const storage = await Storage.attach(contractAddress);\n\n // Get current value\n const currentValue = await storage.retrieve();\n console.log('Current stored value:', currentValue.toString());\n\n // Store a new value\n const newValue = 42;\n console.log(`Storing new value: ${newValue}...`);\n const tx = await storage.store(newValue);\n\n // Wait for transaction to be mined\n await tx.wait();\n console.log('Transaction confirmed');\n\n // Get updated value\n const updatedValue = await storage.retrieve();\n console.log('Updated stored value:', updatedValue.toString());\n }\n\n main()\n .then(() => process.exit(0))\n .catch((error) => {\n console.error(error);\n process.exit(1);\n });\n ```\n\n Ensure that `INSERT_DEPLOYED_CONTRACT_ADDRESS` is replaced with the value obtained in the previous step.\n\n2. Run the interaction script:\n\n ```bash\n npx hardhat run scripts/interact.js --network passetHub\n ```\n\n3. If successful, the terminal will show the following output:\n\n
\n npx hardhat run scripts/interact.js --network passetHub\n Current stored value: 0\n Storing new value: 42...\n Transaction confirmed\n Updated stored value: 42\n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 7, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 18086, "end_char": 18691, "estimated_token_count": 122, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully set up a Hardhat development environment, written comprehensive tests for your Storage contract, and deployed it to local and Polkadot Hub TestNet networks. This tutorial covered essential steps in smart contract development, including configuration, testing, deployment, and interaction.\n\nTo get started with a working example right away, you can clone the repository and navigate to the project directory:\n\n```bash\ngit clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.8\ncd polkavm-hardhat-examples/storage-hardhat\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 2, "depth": 2, "title": "Setting Up the Development Environment", "anchor": "setting-up-the-development-environment", "start_char": 1367, "end_char": 3576, "estimated_token_count": 504, "token_estimator": "heuristic-v1", "text": "## Setting Up the Development Environment\n\nLet's start by setting up Hardhat for your Storage contract project:\n\n1. Create a new directory for your project and navigate into it:\n\n ```bash\n mkdir storage-hardhat\n cd storage-hardhat\n ```\n\n2. Initialize a new npm project:\n\n ```bash\n npm init -y\n ```\n\n3. Install `hardhat-polkadot` and all required plugins:\n\n ```bash\n npm install --save-dev @parity/hardhat-polkadot@0.1.9 solc@0.8.28\n ```\n\n For dependencies compatibility, ensure to install the `@nomicfoundation/hardhat-toolbox` dependency with the `--force` flag:\n\n ```bash\n npm install --force @nomicfoundation/hardhat-toolbox \n ```\n\n5. Initialize a Hardhat project:\n\n ```bash\n npx hardhat-polkadot init\n ```\n\n Select **Create an empty hardhat.config.js** when prompted.\n\n6. Configure Hardhat by updating the `hardhat.config.js` file:\n\n ```javascript title=\"hardhat.config.js\"\n \n ```\n\n Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. \n\n If you need to build these binaries, follow the [Installation](/develop/smart-contracts/local-development-node#install-the-substrate-node-and-eth-rpc-adapter){target=\\_blank} section on the Local Development Node page.\n\n The configuration also defines two network settings: \n\n - **`localNode`**: Runs a PolkaVM instance on `http://127.0.0.1:8545` for local development and testing.\n - **`passetHub`**: Connects to the the Polkadot Hub TestNet network using a predefined RPC URL and a private key stored in environment variables.\n\n7. Export your private key and save it in your Hardhat environment:\n\n ```bash\n npx hardhat vars set PRIVATE_KEY \"INSERT_PRIVATE_KEY\"\n ```\n\n Replace `INSERT_PRIVATE_KEY` with your actual private key. \n \n For further details on private key exportation, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\\_blank}.\n\n !!! warning\n Keep your private key safe, and never share it with anyone. 
If it is compromised, your funds can be stolen."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 3, "depth": 2, "title": "Adding the Smart Contract", "anchor": "adding-the-smart-contract", "start_char": 3576, "end_char": 4951, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "## Adding the Smart Contract\n\n1. Create a new folder called `contracts` and create a `Storage.sol` file. Add the contract code from the previous tutorial:\n\n ```solidity title=\"Storage.sol\"\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```\n\n2. Compile the contract:\n\n ```bash\n npx hardhat compile\n ```\n\n3. If successful, you will see the following output in your terminal:\n\n
\n npx hardhat compile\n Compiling 1 Solidity file\n Successfully compiled 1 Solidity file\n
\n\nAfter compilation, the `artifacts-pvm` and `cache-pvm` folders, containing the metadata and binary files of your compiled contract, will be created in the root of your project."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 4, "depth": 2, "title": "Writing Tests", "anchor": "writing-tests", "start_char": 4951, "end_char": 11980, "estimated_token_count": 1480, "token_estimator": "heuristic-v1", "text": "## Writing Tests\n\nTesting is a critical part of smart contract development. Hardhat makes it easy to write tests in JavaScript using frameworks like [Mocha](https://mochajs.org/){target=\\_blank} and [Chai](https://www.chaijs.com/){target=\\_blank}.\n\n1. Create a folder for testing called `test`. Inside that directory, create a file named `Storage.js` and add the following code:\n\n ```javascript title=\"Storage.js\" \n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n // Add your logic here\n });\n });\n ```\n\n The `beforeEach` hook ensures stateless contract execution by redeploying a fresh instance of the Storage contract before each test case. This approach guarantees that each test starts with a clean and independent contract state by using `ethers.getSigners()` to obtain test accounts and `ethers.getContractFactory('Storage').deploy()` to create a new contract instance.\n\n Now, you can add custom unit tests to check your contract functionality. Some example tests are available below:\n\n 1. **Initial state verification**: Ensures that the contract starts with a default value of zero, which is a fundamental expectation for the `Storage.sol` contract.\n\n ```javascript title=\"Storage.js\"\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n ```\n\n Explanation:\n\n - Checks the initial state of the contract.\n - Verifies that a newly deployed contract has a default value of 0.\n - Confirms the `retrieve()` method works correctly for a new contract.\n\n 2. **Value storage test**: Validate the core functionality of storing and retrieving a value in the contract.\n\n ```javascript title=\"Storage.js\"\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n ```\n\n Explanation:\n\n - Demonstrates the ability to store a specific value.\n - Checks that the stored value can be retrieved correctly.\n - Verifies the basic write and read functionality of the contract.\n\n 3. 
**Event emission verification**: Confirm that the contract emits the correct event when storing a value, which is crucial for off-chain tracking.\n\n ```javascript title=\"Storage.js\"\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n ```\n\n Explanation:\n\n - Ensures the `NumberChanged` event is emitted during storage.\n - Verifies that the event contains the correct stored value.\n - Validates the contract's event logging mechanism.\n\n 4. **Sequential value storage test**: Check the contract's ability to store multiple values sequentially and maintain the most recent value.\n\n ```javascript title=\"Storage.js\"\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n ```\n\n Explanation:\n\n - Verifies that multiple values can be stored in sequence.\n - Confirms that each new store operation updates the contract's state.\n - Demonstrates the contract's ability always to reflect the most recently stored value.\n\n The complete `test/Storage.js` should look like this:\n\n ???--- code \"View complete script\"\n ```javascript title=\"Storage.js\"\n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n });\n });\n ```\n\n2. Run the tests:\n\n ```bash\n npx hardhat test\n ```\n\n3. After running the above command, you will see the output showing that all tests have passed:\n\n
\n npx hardhat test\n Storage\n Basic functionality\n ✔ Should return 0 initially\n ✔ Should update when store is called (1126ms)\n ✔ Should emit an event when storing a value (1131ms)\n ✔ Should allow storing sequentially increasing values (12477ms)\n 4 passing (31s) \n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 5, "depth": 2, "title": "Deploying with Ignition", "anchor": "deploying-with-ignition", "start_char": 11980, "end_char": 14983, "estimated_token_count": 731, "token_estimator": "heuristic-v1", "text": "## Deploying with Ignition\n\n[Hardhat's Ignition](https://hardhat.org/ignition/docs/getting-started#overview){target=\\_blank} is a deployment system designed to make deployments predictable and manageable. Let's create a deployment script:\n\n1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic:\n\n ```javascript title=\"StorageModule.js\"\n \n ```\n\n2. Deploy to the local network:\n\n 1. First, start a local node:\n\n ```bash\n npx hardhat node\n ```\n\n 2. Then, in a new terminal window, deploy the contract:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n ```\n\n 3. If successful, output similar to the following will display in your terminal:\n\n
\n npx hardhat ignition deploy ./ignition/modules/Storage.js --network localNode\n ✔ Confirm deploy to network localNode (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xc01Ee7f10EA4aF4673cFff62710E1D7792aBa8f3\n
\n\n3. Deploy to the Polkadot Hub TestNet:\n\n 1. Make sure your account has enough PAS tokens for gas fees, then run:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n ```\n\n 2. After deployment, you'll see the contract address in the console output. Save this address for future interactions.\n\n
\n npx hardhat ignition deploy ./ignition/modules/Storage.js --network passetHub\n ✔ Confirm deploy to network localNode (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xE8693cE64b294E26765573398C7Ca5C700E9C85c\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 6, "depth": 2, "title": "Interacting with Your Deployed Contract", "anchor": "interacting-with-your-deployed-contract", "start_char": 14983, "end_char": 16937, "estimated_token_count": 460, "token_estimator": "heuristic-v1", "text": "## Interacting with Your Deployed Contract\n\nTo interact with your deployed contract:\n\n1. Create a new folder named `scripts` and add the `interact.js` with the following content:\n\n ```javascript title=\"interact.js\"\n const hre = require('hardhat');\n\n async function main() {\n // Replace with your deployed contract address\n const contractAddress = 'INSERT_DEPLOYED_CONTRACT_ADDRESS';\n\n // Get the contract instance\n const Storage = await hre.ethers.getContractFactory('Storage');\n const storage = await Storage.attach(contractAddress);\n\n // Get current value\n const currentValue = await storage.retrieve();\n console.log('Current stored value:', currentValue.toString());\n\n // Store a new value\n const newValue = 42;\n console.log(`Storing new value: ${newValue}...`);\n const tx = await storage.store(newValue);\n\n // Wait for transaction to be mined\n await tx.wait();\n console.log('Transaction confirmed');\n\n // Get updated value\n const updatedValue = await storage.retrieve();\n console.log('Updated stored value:', updatedValue.toString());\n }\n\n main()\n .then(() => process.exit(0))\n .catch((error) => {\n console.error(error);\n process.exit(1);\n });\n ```\n\n Ensure that `INSERT_DEPLOYED_CONTRACT_ADDRESS` is replaced with the value obtained in the previous step.\n\n2. Run the interaction script:\n\n ```bash\n npx hardhat run scripts/interact.js --network passetHub\n ```\n\n3. If successful, the terminal will show the following output:\n\n
\n npx hardhat run scripts/interact.js --network passetHub\n Current stored value: 0\n Storing new value: 42...\n Transaction confirmed\n Updated stored value: 42\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 7, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 16937, "end_char": 17542, "estimated_token_count": 122, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully set up a Hardhat development environment, written comprehensive tests for your Storage contract, and deployed it to local and Polkadot Hub TestNet networks. This tutorial covered essential steps in smart contract development, including configuration, testing, deployment, and interaction.\n\nTo get started with a working example right away, you can clone the repository and navigate to the project directory:\n\n```bash\ngit clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.8\ncd polkavm-hardhat-examples/storage-hardhat\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project", "page_title": "Launch Your First Project", "index": 0, "depth": 2, "title": "Development Pathway", "anchor": "development-pathway", "start_char": 873, "end_char": 1162, "estimated_token_count": 65, "token_estimator": "heuristic-v1", "text": "## Development Pathway\n\n- **Beginner-friendly**: Step-by-step instructions suitable for newcomers to smart contract development.\n- **Hands-on learning**: Practical exercises that build real-world skills.\n- **Production-ready**: Progress from basic concepts to deployment-ready contracts."} {"page_id": "tutorials-smart-contracts-launch-your-first-project", "page_title": "Launch Your First Project", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1162, "end_char": 1211, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 0, "depth": 2, "title": "What to Expect from These Tutorials", "anchor": "what-to-expect-from-these-tutorials", "start_char": 440, "end_char": 739, "estimated_token_count": 67, "token_estimator": "heuristic-v1", "text": "## What to Expect from These Tutorials\n\n- **Beginner to advanced**: Suitable for developers of all levels.\n- **Complete workflows**: Covers the entire process from writing code to on-chain deployment.\n- **Interactive examples**: Follow along with real, working code that you can modify and expand."} {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 1, "depth": 2, "title": "Start Building", "anchor": "start-building", "start_char": 739, "end_char": 1008, "estimated_token_count": 51, "token_estimator": "heuristic-v1", "text": "## Start Building\n\nJump into the tutorials and learn how to:\n\n- Write and compile smart contracts.\n- Deploy contracts to the Polkadot network.\n- Interact with deployed contracts using libraries like Ethers.js and viem.\n\nChoose a tutorial below and start coding today!"} {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 2, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1008, "end_char": 1057, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials", "page_title": "Tutorials", "index": 0, "depth": 2, "title": "Polkadot Zero to Hero", "anchor": "polkadot-zero-to-hero", "start_char": 326, "end_char": 452, 
"estimated_token_count": 25, "token_estimator": "heuristic-v1", "text": "## Polkadot Zero to Hero\n\nThe Zero to Hero series offers step-by-step guidance to development across the Polkadot ecosystem."} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 1, "depth": 3, "title": "Parachain Developers", "anchor": "parachain-developers", "start_char": 452, "end_char": 948, "estimated_token_count": 135, "token_estimator": "heuristic-v1", "text": "### Parachain Developers\n\n"} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 2, "depth": 2, "title": "Featured Tutorials", "anchor": "featured-tutorials", "start_char": 948, "end_char": 2449, "estimated_token_count": 422, "token_estimator": "heuristic-v1", "text": "## Featured Tutorials\n\n"} -{"page_id": "tutorials", "page_title": "Tutorials", "index": 3, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2449, "end_char": 2498, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 1, "depth": 3, "title": "Parachain Developers", "anchor": "parachain-developers", "start_char": 452, "end_char": 937, "estimated_token_count": 132, "token_estimator": "heuristic-v1", "text": "### Parachain Developers\n\n"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 2, "depth": 2, "title": "Featured Tutorials", "anchor": "featured-tutorials", "start_char": 937, "end_char": 2394, "estimated_token_count": 410, "token_estimator": "heuristic-v1", "text": "## Featured Tutorials\n\n"} +{"page_id": "tutorials", "page_title": "Tutorials", "index": 3, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2394, "end_char": 2443, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} From 52310a524884cb01a8ae64b62ef03628f6af7620 Mon Sep 17 00:00:00 2001 From: 0xlukem Date: Mon, 27 Oct 2025 11:09:42 -0300 Subject: [PATCH 3/3] fresh llms --- .ai/categories/basics.md | 195 ++- .ai/categories/dapps.md | 201 +++- .ai/categories/infrastructure.md | 240 +++- .ai/categories/networks.md | 201 +++- .ai/categories/parachains.md | 364 +++++- .ai/categories/polkadot-protocol.md | 201 +++- .ai/categories/reference.md | 6 +- .ai/categories/smart-contracts.md | 419 ++++++- .ai/categories/tooling.md | 1062 ++++++++++++++++- .../develop-interoperability-send-messages.md | 3 +- ...develop-interoperability-test-and-debug.md | 56 +- ...velop-interoperability-xcm-runtime-apis.md | 6 +- ...-deployment-build-deterministic-runtime.md | 24 +- ...rachains-maintenance-storage-migrations.md | 106 +- ...develop-parachains-testing-benchmarking.md | 35 +- ...s-precompiles-interact-with-precompiles.md | 218 ++++ ...arding-and-offboarding-start-validating.md | 39 +- ...adot-protocol-parachain-basics-accounts.md | 30 +- ...ins-zero-to-hero-add-pallets-to-runtime.md | 62 +- ...nch-your-first-project-create-contracts.md | 44 +- ...our-first-project-create-dapp-ethers-js.md | 478 +++++++- ...nch-your-first-project-create-dapp-viem.md | 362 +++++- ...st-project-test-and-deploy-with-hardhat.md | 45 +- .ai/site-index.json | 538 ++++----- llms-full.jsonl | 146 +-- 25 files changed, 4548 insertions(+), 533 deletions(-) diff --git a/.ai/categories/basics.md b/.ai/categories/basics.md index 3abb8916f..0c412865b 100644 --- a/.ai/categories/basics.md +++ b/.ai/categories/basics.md @@ -140,7 +140,7 @@ First, you'll update the 
runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -159,11 +159,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... ] ``` @@ -287,13 +285,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1634,13 +1682,53 @@ To build the smart contract, follow the steps below: 6. 
Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -4928,7 +5016,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -4952,7 +5049,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. 
+ pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -5663,7 +5777,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -6485,19 +6600,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. 
For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. diff --git a/.ai/categories/dapps.md b/.ai/categories/dapps.md index 098dc2577..eaaf28a4f 100644 --- a/.ai/categories/dapps.md +++ b/.ai/categories/dapps.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... ] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1994,13 +2042,53 @@ To build the smart contract, follow the steps below: 6. 
Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -7285,7 +7373,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7309,7 +7406,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. 
+ pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -8693,7 +8807,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -9883,19 +9998,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. 
For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -11173,7 +11338,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -11450,7 +11615,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -11681,7 +11846,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/infrastructure.md b/.ai/categories/infrastructure.md index 2588c71da..1c5500789 100644 --- a/.ai/categories/infrastructure.md +++ b/.ai/categories/infrastructure.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... 
] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1635,13 +1683,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -7116,7 +7204,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7140,7 +7237,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -8058,7 +8172,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -9804,7 +9919,44 @@ touch /etc/systemd/system/polkadot-validator.service In this unit file, you will write the commands that you want to run on server boot/restart: ```systemd title="/etc/systemd/system/polkadot-validator.service" - +[Unit] +Description=Polkadot Node +After=network.target +Documentation=https://github.com/paritytech/polkadot-sdk + +[Service] +EnvironmentFile=-/etc/default/polkadot +ExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS +User=polkadot +Group=polkadot +Restart=always +RestartSec=120 +CapabilityBoundingSet= +LockPersonality=true +NoNewPrivileges=true +PrivateDevices=true +PrivateMounts=true +PrivateTmp=true +PrivateUsers=true +ProtectClock=true +ProtectControlGroups=true +ProtectHostname=true +ProtectKernelModules=true +ProtectKernelTunables=true +ProtectSystem=strict +RemoveIPC=true +RestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX +RestrictNamespaces=false +RestrictSUIDSGID=true +SystemCallArchitectures=native +SystemCallFilter=@system-service +SystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2 +SystemCallFilter=~@clock @module @reboot @swap @privileged +SystemCallFilter=pivot_root +UMask=0027 + +[Install] +WantedBy=multi-user.target ``` !!! warning "Restart delay and equivocation risk" @@ -9969,19 +10121,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! 
{ + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -11490,7 +11692,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -11767,7 +11969,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -11998,7 +12200,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/networks.md b/.ai/categories/networks.md index 6bbf52aec..92f68ec16 100644 --- a/.ai/categories/networks.md +++ b/.ai/categories/networks.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... 
] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1635,13 +1683,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -6168,7 +6256,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -6192,7 +6289,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -6903,7 +7017,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -7725,19 +7840,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -8922,7 +9087,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? 
interface "Input parameters" @@ -9199,7 +9364,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -9430,7 +9595,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/parachains.md b/.ai/categories/parachains.md index 1bdcf1ac0..740d079d4 100644 --- a/.ai/categories/parachains.md +++ b/.ai/categories/parachains.md @@ -538,7 +538,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -557,11 +557,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... ] ``` @@ -685,13 +683,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features 
= false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -998,7 +1046,40 @@ my-pallet/ With the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\_blank} to get started as follows: ```rust title="benchmarking.rs (starter template)" +//! Benchmarking setup for pallet-template +#![cfg(feature = "runtime-benchmarks")] + +use super::*; +use frame_benchmarking::v2::*; +#[benchmarks] +mod benchmarks { + use super::*; + #[cfg(test)] + use crate::pallet::Pallet as Template; + use frame_system::RawOrigin; + + #[benchmark] + fn do_something() { + let caller: T::AccountId = whitelisted_caller(); + #[extrinsic_call] + do_something(RawOrigin::Signed(caller), 100); + + assert_eq!(Something::::get().map(|v| v.block_number), Some(100u32.into())); + } + + #[benchmark] + fn cause_error() { + Something::::put(CompositeStruct { block_number: 100u32.into() }); + let caller: T::AccountId = whitelisted_caller(); + #[extrinsic_call] + cause_error(RawOrigin::Signed(caller)); + + assert_eq!(Something::::get().map(|v| v.block_number), Some(101u32.into())); + } + + impl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test); +} ``` In your benchmarking tests, employ these best practices: @@ -2031,7 +2112,29 @@ To add a GitHub workflow for building the runtime: {% raw %} ```yml - + name: Srtool build + + on: push + + jobs: + srtool: + runs-on: ubuntu-latest + strategy: + matrix: + chain: ["asset-hub-kusama", "asset-hub-westend"] + steps: + - uses: actions/checkout@v3 + - name: Srtool build + id: srtool_build + uses: chevdor/srtool-actions@v0.8.0 + with: + chain: ${{ matrix.chain }} + runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime + - name: Summary + run: | + echo '${{ steps.srtool_build.outputs.json }}' | jq . > ${{ matrix.chain }}-srtool-digest.json + cat ${{ matrix.chain }}-srtool-digest.json + echo "Runtime location: ${{ steps.srtool_build.outputs.wasm }}" ``` {% endraw %} @@ -2792,13 +2895,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -12315,7 +12458,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -12339,7 +12491,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -13329,7 +13498,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -14244,7 +14414,111 @@ Examine the following migration example that transforms a simple `StorageValue` - Migration: ```rust - + use frame_support::{ + storage_alias, + traits::{Get, UncheckedOnRuntimeUpgrade}, + }; + + #[cfg(feature = "try-runtime")] + use alloc::vec::Vec; + + /// Collection of storage item formats from the previous storage version. + /// + /// Required so we can read values in the v0 storage format during the migration. + mod v0 { + use super::*; + + /// V0 type for [`crate::Value`]. + #[storage_alias] + pub type Value = StorageValue, u32>; + } + + /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1. + /// + /// In V0 of the template [`crate::Value`] is just a `u32`. In V1, it has been upgraded to + /// contain the struct [`crate::CurrentAndPreviousValue`]. + /// + /// In this migration, update the on-chain storage for the pallet to reflect the new storage + /// layout. + pub struct InnerMigrateV0ToV1(core::marker::PhantomData); + + impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 { + /// Return the existing [`crate::Value`] so we can check that it was correctly set in + /// `InnerMigrateV0ToV1::post_upgrade`. + #[cfg(feature = "try-runtime")] + fn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> { + use codec::Encode; + + // Access the old value using the `storage_alias` type + let old_value = v0::Value::::get(); + // Return it as an encoded `Vec` + Ok(old_value.encode()) + } + + /// Migrate the storage from V0 to V1. + /// + /// - If the value doesn't exist, there is nothing to do. + /// - If the value exists, it is read and then written back to storage inside a + /// [`crate::CurrentAndPreviousValue`]. + fn on_runtime_upgrade() -> frame_support::weights::Weight { + // Read the old value from storage + if let Some(old_value) = v0::Value::::take() { + // Write the new value to storage + let new = crate::CurrentAndPreviousValue { current: old_value, previous: None }; + crate::Value::::put(new); + // One read + write for taking the old value, and one write for setting the new value + T::DbWeight::get().reads_writes(1, 2) + } else { + // No writes since there was no old value, just one read for checking + T::DbWeight::get().reads(1) + } + } + + /// Verifies the storage was migrated correctly. + /// + /// - If there was no old value, the new value should not be set. + /// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`]. 
+ #[cfg(feature = "try-runtime")] + fn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> { + use codec::Decode; + use frame_support::ensure; + + let maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| { + sp_runtime::TryRuntimeError::Other("Failed to decode old value from storage") + })?; + + match maybe_old_value { + Some(old_value) => { + let expected_new_value = + crate::CurrentAndPreviousValue { current: old_value, previous: None }; + let actual_new_value = crate::Value::::get(); + + ensure!(actual_new_value.is_some(), "New value not set"); + ensure!( + actual_new_value == Some(expected_new_value), + "New value not set correctly" + ); + }, + None => { + ensure!(crate::Value::::get().is_none(), "New value unexpectedly set"); + }, + }; + Ok(()) + } + } + + /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a + /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that: + /// - The migration only runs once when the on-chain storage version is 0 + /// - The on-chain storage version is updated to `1` after the migration executes + /// - Reads/Writes from checking/settings the on-chain storage version are accounted for + pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration< + 0, // The migration will only execute when the on-chain storage version is 0 + 1, // The on-chain storage version will be set to 1 after the migration is complete + InnerMigrateV0ToV1, + crate::pallet::Pallet, + ::DbWeight, + >; ``` ### Migration Organization @@ -14401,19 +14675,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! 
{ + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -16091,7 +16415,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -16368,7 +16692,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -16599,7 +16923,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/polkadot-protocol.md b/.ai/categories/polkadot-protocol.md index d88e67c5c..afb31a425 100644 --- a/.ai/categories/polkadot-protocol.md +++ b/.ai/categories/polkadot-protocol.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... 
] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -2055,13 +2103,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -7003,7 +7091,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7027,7 +7124,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -7870,7 +7984,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. 
```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -8692,19 +8807,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -9889,7 +10054,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? 
interface "Input parameters" @@ -10166,7 +10331,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -10397,7 +10562,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/reference.md b/.ai/categories/reference.md index 10698cf30..bb90f4a59 100644 --- a/.ai/categories/reference.md +++ b/.ai/categories/reference.md @@ -1610,7 +1610,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -1887,7 +1887,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -2118,7 +2118,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/smart-contracts.md b/.ai/categories/smart-contracts.md index d3d25b143..d8fa7110c 100644 --- a/.ai/categories/smart-contracts.md +++ b/.ai/categories/smart-contracts.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... 
] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1784,13 +1832,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -4494,7 +4582,30 @@ To interact with the ECRecover precompile, you can deploy the `ECRecoverExample` The SHA-256 precompile computes the SHA-256 hash of the input data. ```solidity title="SHA256.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract SHA256Example { + event SHA256Called(bytes result); + + // Address of the SHA256 precompile + address constant SHA256_PRECOMPILE = address(0x02); + bytes public result; + + function callH256(bytes calldata input) public { + bool success; + bytes memory resultInMemory; + + (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input); + + if (success) { + emit SHA256Called(resultInMemory); + } + + result = resultInMemory; + } +} ``` To use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\_blank} module. @@ -4611,7 +4722,38 @@ To use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart The BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations. ```solidity title="BN128Add.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.20; + +contract BN128AddExample { + address constant BN128_ADD_PRECOMPILE = address(0x06); + + event BN128Added(uint256 x3, uint256 y3); + + uint256 public resultX; + uint256 public resultY; + + function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public { + bytes memory input = abi.encodePacked( + bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2) + ); + bool success; + bytes memory output; + + (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input); + + require(success, "BN128Add precompile call failed"); + require(output.length == 64, "Invalid output length"); + + (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256)); + + resultX = x3; + resultY = y3; + + emit BN128Added(x3, y3); + } +} ``` To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\_blank} demonstrates a valid curve addition and checks the result against known expected values. 
@@ -4621,7 +4763,42 @@ To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/sma The BN128Mul precompile performs scalar multiplication on the alt_bn128 curve. ```solidity title="BN128Mul.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; +contract BN128MulExample { + // Precompile address for BN128Mul + address constant BN128_MUL_ADDRESS = address(0x07); + + bytes public result; + + // Performs scalar multiplication of a point on the alt_bn128 curve + function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public { + // Format: [x, y, scalar] - each 32 bytes + bytes memory input = abi.encodePacked( + bytes32(x1), + bytes32(y1), + bytes32(scalar) + ); + + (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{ + value: 0 + }(input); + require(success, "BN128Mul precompile call failed"); + + result = resultInMemory; + } + + // Helper to decode result from `result` storage + function getResult() public view returns (uint256 x2, uint256 y2) { + bytes memory tempResult = result; + require(tempResult.length >= 64, "Invalid result length"); + assembly { + x2 := mload(add(tempResult, 32)) + y2 := mload(add(tempResult, 64)) + } + } +} ``` To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`. @@ -4631,7 +4808,38 @@ To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-envi The BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification. ```solidity title="BN128Pairing.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract BN128PairingExample { + // Precompile address for BN128Pairing + address constant BN128_PAIRING_ADDRESS = address(0x08); + + bytes public result; + // Performs a pairing check on the alt_bn128 curve + function bn128Pairing(bytes memory input) public { + // Call the precompile + (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS + .call{value: 0}(input); + require(success, "BN128Pairing precompile call failed"); + + result = resultInMemory; + } + + // Helper function to decode the result from `result` storage + function getResult() public view returns (bool isValid) { + bytes memory tempResult = result; + require(tempResult.length == 32, "Invalid result length"); + + uint256 output; + assembly { + output := mload(add(tempResult, 32)) + } + + isValid = (output == 1); + } +} ``` You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\_blank} contains these tests with working examples. @@ -4641,7 +4849,105 @@ You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-env The Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function. 
```solidity title="Blake2F.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract Blake2FExample { + // Precompile address for Blake2F + address constant BLAKE2F_ADDRESS = address(0x09); + + bytes public result; + + function blake2F(bytes memory input) public { + // Input must be exactly 213 bytes + require(input.length == 213, "Invalid input length - must be 213 bytes"); + + // Call the precompile + (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{ + value: 0 + }(input); + require(success, "Blake2F precompile call failed"); + + result = resultInMemory; + } + // Helper function to decode the result from `result` storage + function getResult() public view returns (bytes32[8] memory output) { + bytes memory tempResult = result; + require(tempResult.length == 64, "Invalid result length"); + + for (uint i = 0; i < 8; i++) { + assembly { + mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i)))) + } + } + } + + + // Helper function to create Blake2F input from parameters + function createBlake2FInput( + uint32 rounds, + bytes32[8] memory h, + bytes32[16] memory m, + bytes8[2] memory t, + bool f + ) public pure returns (bytes memory) { + // Start with rounds (4 bytes, big-endian) + bytes memory input = abi.encodePacked(rounds); + + // Add state vector h (8 * 32 = 256 bytes) + for (uint i = 0; i < 8; i++) { + input = abi.encodePacked(input, h[i]); + } + + // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes) + // Blake2F expects 64-bit words in little-endian format + for (uint i = 0; i < 16; i++) { + // Take only the first 8 bytes of each bytes32 and reverse for little-endian + bytes8 word = bytes8(m[i]); + input = abi.encodePacked(input, word); + } + + // Add offset counters t (2 * 8 = 16 bytes) + input = abi.encodePacked(input, t[0], t[1]); + + // Add final block flag (1 byte) + input = abi.encodePacked(input, f ? bytes1(0x01) : bytes1(0x00)); + + return input; + } + + // Simplified function that works with raw hex input + function blake2FFromHex(string memory hexInput) public { + bytes memory input = hexStringToBytes(hexInput); + blake2F(input); + } + + // Helper function to convert hex string to bytes + function hexStringToBytes(string memory hexString) public pure returns (bytes memory) { + bytes memory hexBytes = bytes(hexString); + require(hexBytes.length % 2 == 0, "Invalid hex string length"); + + bytes memory result = new bytes(hexBytes.length / 2); + + for (uint i = 0; i < hexBytes.length / 2; i++) { + result[i] = bytes1( + (hexCharToByte(hexBytes[2 * i]) << 4) | + hexCharToByte(hexBytes[2 * i + 1]) + ); + } + + return result; + } + + function hexCharToByte(bytes1 char) internal pure returns (uint8) { + uint8 c = uint8(char); + if (c >= 48 && c <= 57) return c - 48; // 0-9 + if (c >= 65 && c <= 70) return c - 55; // A-F + if (c >= 97 && c <= 102) return c - 87; // a-f + revert("Invalid hex character"); + } +} ``` To use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors. 
@@ -7687,7 +7993,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -7711,7 +8026,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -8422,7 +8754,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -9244,19 +9577,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! 
{ + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -12982,7 +13365,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -13259,7 +13642,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? 
interface "Input parameters" @@ -13490,7 +13873,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/categories/tooling.md b/.ai/categories/tooling.md index eef78fbc5..e0a1d2e09 100644 --- a/.ai/categories/tooling.md +++ b/.ai/categories/tooling.md @@ -141,7 +141,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -160,11 +160,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... ] ``` @@ -288,13 +286,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` @@ -1660,7 +1708,31 @@ npm install ethers@6.13.5 To interact with 
the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code: ```javascript title="app/utils/ethers.js" +import { JsonRpcProvider } from 'ethers'; +export const PASSET_HUB_CONFIG = { + name: 'Passet Hub', + rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC + chainId: 420420422, // Passet Hub testnet chainId + blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/', +}; + +export const getProvider = () => { + return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, { + chainId: PASSET_HUB_CONFIG.chainId, + name: PASSET_HUB_CONFIG.name, + }); +}; + +// Helper to get a signer from a provider +export const getSigner = async (provider) => { + if (window.ethereum) { + await window.ethereum.request({ method: 'eth_requestAccounts' }); + const ethersProvider = new ethers.BrowserProvider(window.ethereum); + return ethersProvider.getSigner(); + } + throw new Error('No Ethereum browser provider detected'); +}; ``` This file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state. @@ -1672,13 +1744,55 @@ For this dApp, you'll use a simple Storage contract already deployed. So, you ne ???+ code "Storage.sol ABI" ```json title="abis/Storage.json" - + [ + { + "inputs": [ + { + "internalType": "uint256", + "name": "_newNumber", + "type": "uint256" + } + ], + "name": "setNumber", + "outputs": [], + "stateMutability": "nonpayable", + "type": "function" + }, + { + "inputs": [], + "name": "storedNumber", + "outputs": [ + { + "internalType": "uint256", + "name": "", + "type": "uint256" + } + ], + "stateMutability": "view", + "type": "function" + } + ] ``` Now, create a file called `app/utils/contract.js`: ```javascript title="app/utils/contract.js" +import { Contract } from 'ethers'; +import { getProvider } from './ethers'; +import StorageABI from '../../abis/Storage.json'; + +export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; + +export const CONTRACT_ABI = StorageABI; + +export const getContract = () => { + const provider = getProvider(); + return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider); +}; +export const getSignedContract = async (signer) => { + return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer); +}; ``` This file defines the contract address, ABI, and functions to create instances of the contract for reading and writing. @@ -1688,7 +1802,167 @@ This file defines the contract address, ABI, and functions to create instances o Next, let's create a component to handle wallet connections. 
Create a new file called `app/components/WalletConnect.js`: ```javascript title="app/components/WalletConnect.js" +'use client'; + +import React, { useState, useEffect } from 'react'; +import { PASSET_HUB_CONFIG } from '../utils/ethers'; + +const WalletConnect = ({ onConnect }) => { + const [account, setAccount] = useState(null); + const [chainId, setChainId] = useState(null); + const [error, setError] = useState(null); + useEffect(() => { + // Check if user already has an authorized wallet connection + const checkConnection = async () => { + if (window.ethereum) { + try { + // eth_accounts doesn't trigger the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_accounts', + }); + if (accounts.length > 0) { + setAccount(accounts[0]); + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }); + setChainId(parseInt(chainIdHex, 16)); + } + } catch (err) { + console.error('Error checking connection:', err); + setError('Failed to check wallet connection'); + } + } + }; + + checkConnection(); + + if (window.ethereum) { + // Setup wallet event listeners + window.ethereum.on('accountsChanged', (accounts) => { + setAccount(accounts[0] || null); + if (accounts[0] && onConnect) onConnect(accounts[0]); + }); + + window.ethereum.on('chainChanged', (chainIdHex) => { + setChainId(parseInt(chainIdHex, 16)); + }); + } + + return () => { + // Cleanup event listeners + if (window.ethereum) { + window.ethereum.removeListener('accountsChanged', () => {}); + window.ethereum.removeListener('chainChanged', () => {}); + } + }; + }, [onConnect]); + + const connectWallet = async () => { + if (!window.ethereum) { + setError( + 'MetaMask not detected! Please install MetaMask to use this dApp.' + ); + return; + } + + try { + // eth_requestAccounts triggers the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_requestAccounts', + }); + setAccount(accounts[0]); + + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }); + const currentChainId = parseInt(chainIdHex, 16); + setChainId(currentChainId); + + // Prompt user to switch networks if needed + if (currentChainId !== PASSET_HUB_CONFIG.chainId) { + await switchNetwork(); + } + + if (onConnect) onConnect(accounts[0]); + } catch (err) { + console.error('Error connecting to wallet:', err); + setError('Failed to connect wallet'); + } + }; + + const switchNetwork = async () => { + try { + await window.ethereum.request({ + method: 'wallet_switchEthereumChain', + params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }], + }); + } catch (switchError) { + // Error 4902 means the chain hasn't been added to MetaMask + if (switchError.code === 4902) { + try { + await window.ethereum.request({ + method: 'wallet_addEthereumChain', + params: [ + { + chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`, + chainName: PASSET_HUB_CONFIG.name, + rpcUrls: [PASSET_HUB_CONFIG.rpc], + blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer], + }, + ], + }); + } catch (addError) { + setError('Failed to add network to wallet'); + } + } else { + setError('Failed to switch network'); + } + } + }; + + // UI-only disconnection - MetaMask doesn't support programmatic disconnection + const disconnectWallet = () => { + setAccount(null); + }; + + return ( +
+ {error &&

{error}

} + + {!account ? ( + + ) : ( +
+ + {`${account.substring(0, 6)}...${account.substring(38)}`} + + + {chainId !== PASSET_HUB_CONFIG.chainId && ( + + )} +
+ )} +
+ ); +}; + +export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. @@ -1697,9 +1971,25 @@ To integrate this component into your dApp, you need to overwrite the existing boilerplate ```javascript title="app/page.js" +import { useState } from 'react'; +import WalletConnect from './components/WalletConnect'; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ +
+ ); +} ``` In your terminal, you can launch your project by running: @@ -1717,7 +2007,64 @@ And you will see the following: Now, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`: ```javascript title="app/components/ReadContract.js" +'use client'; + +import React, { useState, useEffect } from 'react'; +import { getContract } from '../utils/contract'; + +const ReadContract = () => { + const [storedNumber, setStoredNumber] = useState(null); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(null); + + useEffect(() => { + // Function to read data from the blockchain + const fetchData = async () => { + try { + setLoading(true); + const contract = getContract(); + // Call the smart contract's storedNumber function + const number = await contract.storedNumber(); + setStoredNumber(number.toString()); + setError(null); + } catch (err) { + console.error('Error fetching stored number:', err); + setError('Failed to fetch data from the contract'); + } finally { + setLoading(false); + } + }; + + fetchData(); + + // Poll for updates every 10 seconds to keep UI in sync with blockchain + const interval = setInterval(fetchData, 10000); + + // Clean up interval on component unmount + return () => clearInterval(interval); + }, []); + + return ( +
+

Contract Data

+ {loading ? ( +
+
+
+ ) : error ? ( +

{error}

+ ) : ( +
+

+ Stored Number: {storedNumber} +

+
+ )} +
+ ); +}; +export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically. @@ -1726,9 +2073,27 @@ To see this change in your dApp, you need to integrate this component into the ` ```javascript title="app/page.js" +import { useState } from 'react'; +import WalletConnect from './components/WalletConnect'; +import ReadContract from './components/ReadContract'; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ + +
+ ); +} ``` Your dApp will automatically be updated to the following: @@ -1740,7 +2105,119 @@ Your dApp will automatically be updated to the following: Finally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`: ```javascript title="app/components/WriteContract.js" +'use client'; + +import { useState } from 'react'; +import { getSignedContract } from '../utils/contract'; +import { ethers } from 'ethers'; + +const WriteContract = ({ account }) => { + const [newNumber, setNewNumber] = useState(''); + const [status, setStatus] = useState({ type: null, message: '' }); + const [isSubmitting, setIsSubmitting] = useState(false); + + const handleSubmit = async (e) => { + e.preventDefault(); + + // Validation checks + if (!account) { + setStatus({ type: 'error', message: 'Please connect your wallet first' }); + return; + } + + if (!newNumber || isNaN(Number(newNumber))) { + setStatus({ type: 'error', message: 'Please enter a valid number' }); + return; + } + + try { + setIsSubmitting(true); + setStatus({ type: 'info', message: 'Initiating transaction...' }); + + // Get a signer from the connected wallet + const provider = new ethers.BrowserProvider(window.ethereum); + const signer = await provider.getSigner(); + const contract = await getSignedContract(signer); + + // Send transaction to blockchain and wait for user confirmation in wallet + setStatus({ + type: 'info', + message: 'Please confirm the transaction in your wallet...', + }); + + // Call the contract's setNumber function + const tx = await contract.setNumber(newNumber); + + // Wait for transaction to be mined + setStatus({ + type: 'info', + message: 'Transaction submitted. Waiting for confirmation...', + }); + const receipt = await tx.wait(); + + setStatus({ + type: 'success', + message: `Transaction confirmed! Transaction hash: ${receipt.hash}`, + }); + setNewNumber(''); + } catch (err) { + console.error('Error updating number:', err); + + // Error code 4001 is MetaMask's code for user rejection + if (err.code === 4001) { + setStatus({ type: 'error', message: 'Transaction rejected by user.' }); + } else { + setStatus({ + type: 'error', + message: `Error: ${err.message || 'Failed to send transaction'}`, + }); + } + } finally { + setIsSubmitting(false); + } + }; + + return ( +
+

Update Stored Number

+ {status.message && ( +
+ {status.message} +
+ )} +
+ setNewNumber(e.target.value)} + disabled={isSubmitting || !account} + className="w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400" + /> + +
+ {!account && ( +

+ Connect your wallet to update the stored number. +

+ )} +
+ ); +}; +export default WriteContract; ``` This component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed. @@ -1748,7 +2225,32 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.js` file to integrate all components: ```javascript title="app/page.js" +'use client'; + +import { useState } from 'react'; + +import WalletConnect from './components/WalletConnect'; +import ReadContract from './components/ReadContract'; +import WriteContract from './components/WriteContract'; + +export default function Home() { + const [account, setAccount] = useState(null); + + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ + + +
+ ); +} ``` The completed UI will display: @@ -1853,7 +2355,47 @@ npm install --save-dev typescript @types/node To interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code: ```typescript title="viem.ts" +import { createPublicClient, http, createWalletClient, custom } from 'viem' +import 'viem/window'; + +const transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io') + +// Configure the Passet Hub chain +export const passetHub = { + id: 420420422, + name: 'Passet Hub', + network: 'passet-hub', + nativeCurrency: { + decimals: 18, + name: 'PAS', + symbol: 'PAS', + }, + rpcUrls: { + default: { + http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], + }, + }, +} as const + +// Create a public client for reading data +export const publicClient = createPublicClient({ + chain: passetHub, + transport +}) + +// Create a wallet client for signing transactions +export const getWalletClient = async () => { + if (typeof window !== 'undefined' && window.ethereum) { + const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' }); + return createWalletClient({ + chain: passetHub, + transport: custom(window.ethereum), + account, + }); + } + throw new Error('No Ethereum browser provider detected'); +}; ``` This file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\_blank} reference for more information. @@ -1899,7 +2441,31 @@ Create a folder called `abis` at the root of your project, then create a file na Next, create a file called `utils/contract.ts`: ```typescript title="contract.ts" +import { getContract } from 'viem'; +import { publicClient, getWalletClient } from './viem'; +import StorageABI from '../../abis/Storage.json'; + +export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; +export const CONTRACT_ABI = StorageABI; +// Create a function to get a contract instance for reading +export const getContractInstance = () => { + return getContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + client: publicClient, + }); +}; + +// Create a function to get a contract instance with a signer for writing +export const getSignedContract = async () => { + const walletClient = await getWalletClient(); + return getContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + client: walletClient, + }); +}; ``` This file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts. @@ -1909,7 +2475,180 @@ This file defines the contract address, ABI, and functions to create a viem [con Now, let's create a component to handle wallet connections. 
Create a new file called `components/WalletConnect.tsx`: ```typescript title="WalletConnect.tsx" +"use client"; + +import React, { useState, useEffect } from "react"; +import { passetHub } from "../utils/viem"; + +interface WalletConnectProps { + onConnect: (account: string) => void; +} + +const WalletConnect: React.FC = ({ onConnect }) => { + const [account, setAccount] = useState(null); + const [chainId, setChainId] = useState(null); + const [error, setError] = useState(null); + + useEffect(() => { + // Check if user already has an authorized wallet connection + const checkConnection = async () => { + if (typeof window !== 'undefined' && window.ethereum) { + try { + // eth_accounts doesn't trigger the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_accounts', + }) as string[]; + + if (accounts.length > 0) { + setAccount(accounts[0]); + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }) as string; + setChainId(parseInt(chainIdHex, 16)); + onConnect(accounts[0]); + } + } catch (err) { + console.error('Error checking connection:', err); + setError('Failed to check wallet connection'); + } + } + }; + + checkConnection(); + + if (typeof window !== 'undefined' && window.ethereum) { + // Setup wallet event listeners + window.ethereum.on('accountsChanged', (accounts: string[]) => { + setAccount(accounts[0] || null); + if (accounts[0]) onConnect(accounts[0]); + }); + + window.ethereum.on('chainChanged', (chainIdHex: string) => { + setChainId(parseInt(chainIdHex, 16)); + }); + } + + return () => { + // Cleanup event listeners + if (typeof window !== 'undefined' && window.ethereum) { + window.ethereum.removeListener('accountsChanged', () => {}); + window.ethereum.removeListener('chainChanged', () => {}); + } + }; + }, [onConnect]); + + const connectWallet = async () => { + if (typeof window === 'undefined' || !window.ethereum) { + setError( + 'MetaMask not detected! Please install MetaMask to use this dApp.' 
+ ); + return; + } + + try { + // eth_requestAccounts triggers the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_requestAccounts', + }) as string[]; + + setAccount(accounts[0]); + + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }) as string; + + const currentChainId = parseInt(chainIdHex, 16); + setChainId(currentChainId); + + // Prompt user to switch networks if needed + if (currentChainId !== passetHub.id) { + await switchNetwork(); + } + + onConnect(accounts[0]); + } catch (err) { + console.error('Error connecting to wallet:', err); + setError('Failed to connect wallet'); + } + }; + + const switchNetwork = async () => { + console.log('Switch network') + try { + await window.ethereum.request({ + method: 'wallet_switchEthereumChain', + params: [{ chainId: `0x${passetHub.id.toString(16)}` }], + }); + } catch (switchError: any) { + // Error 4902 means the chain hasn't been added to MetaMask + if (switchError.code === 4902) { + try { + await window.ethereum.request({ + method: 'wallet_addEthereumChain', + params: [ + { + chainId: `0x${passetHub.id.toString(16)}`, + chainName: passetHub.name, + rpcUrls: [passetHub.rpcUrls.default.http[0]], + nativeCurrency: { + name: passetHub.nativeCurrency.name, + symbol: passetHub.nativeCurrency.symbol, + decimals: passetHub.nativeCurrency.decimals, + }, + }, + ], + }); + } catch (addError) { + setError('Failed to add network to wallet'); + } + } else { + setError('Failed to switch network'); + } + } + }; + + // UI-only disconnection - MetaMask doesn't support programmatic disconnection + const disconnectWallet = () => { + setAccount(null); + }; + + return ( +
+ {error &&

{error}

} + + {!account ? ( + + ) : ( +
+ + {`${account.substring(0, 6)}...${account.substring(38)}`} + + + {chainId !== passetHub.id && ( + + )} +
+ )} +
+ ); +}; +export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected. @@ -1918,9 +2657,24 @@ To use this component in your dApp, replace the existing boilerplate in `app/pag ```typescript title="page.tsx" +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; - + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ +
+ ); +} ``` Now you're ready to run your dApp. From your project directory, execute: @@ -1938,7 +2692,70 @@ Navigate to `http://localhost:3000` in your browser, and you should see your dAp Now, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`: ```typescript title="ReadContract.tsx" +'use client'; + +import React, { useState, useEffect } from 'react'; +import { publicClient } from '../utils/viem'; +import { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract'; + +const ReadContract: React.FC = () => { + const [storedNumber, setStoredNumber] = useState(null); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(null); + + useEffect(() => { + // Function to read data from the blockchain + const fetchData = async () => { + try { + setLoading(true); + // Call the smart contract's storedNumber function + const number = await publicClient.readContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + functionName: 'storedNumber', + args: [], + }) as bigint; + + setStoredNumber(number.toString()); + setError(null); + } catch (err) { + console.error('Error fetching stored number:', err); + setError('Failed to fetch data from the contract'); + } finally { + setLoading(false); + } + }; + + fetchData(); + + // Poll for updates every 10 seconds to keep UI in sync with blockchain + const interval = setInterval(fetchData, 10000); + + // Clean up interval on component unmount + return () => clearInterval(interval); + }, []); + + return ( +
+

Contract Data

+ {loading ? ( +
+
+
+ ) : error ? ( +

{error}

+ ) : ( +
+

+ Stored Number: {storedNumber} +

+
+ )} +
+ ); +}; +export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state. @@ -1947,9 +2764,26 @@ To reflect this change in your dApp, incorporate this component into the `app/pa ```typescript title="page.tsx" +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +import ReadContract from "./components/ReadContract"; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; - + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ + +
+ ); +} ``` And you will see in your browser: @@ -2185,7 +3019,31 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.tsx` file to integrate all components: ```typescript title="page.tsx" +"use client"; +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +import ReadContract from "./components/ReadContract"; +import WriteContract from "./components/WriteContract"; + +export default function Home() { + const [account, setAccount] = useState(null); + + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; + + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ + + +
+ ); +} ``` After that, you will see: @@ -2343,13 +3201,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code @@ -15009,7 +15907,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -15033,7 +15940,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. 
+ pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: @@ -16378,7 +17302,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. @@ -17553,7 +18478,42 @@ Let's start by setting up Hardhat for your Storage contract project: 6. Configure Hardhat by updating the `hardhat.config.js` file: ```javascript title="hardhat.config.js" - + require("@nomicfoundation/hardhat-toolbox"); + + require("@parity/hardhat-polkadot"); + + const { vars } = require("hardhat/config"); + + /** @type import('hardhat/config').HardhatUserConfig */ + module.exports = { + solidity: "0.8.28", + resolc: { + compilerSource: "npm", + }, + networks: { + hardhat: { + polkavm: true, + nodeConfig: { + nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE', + rpcPort: 8000, + dev: true, + }, + adapterConfig: { + adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER', + dev: true, + }, + }, + localNode: { + polkavm: true, + url: `http://127.0.0.1:8545`, + }, + passetHub: { + polkavm: true, + url: 'https://testnet-passet-hub-eth-rpc.polkadot.io', + accounts: [vars.get("PRIVATE_KEY")], + }, + }, + }; ``` Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. @@ -17807,7 +18767,13 @@ Testing is a critical part of smart contract development. Hardhat makes it easy 1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic: ```javascript title="StorageModule.js" - + const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules'); + + module.exports = buildModule('StorageModule', (m) => { + const storage = m.contract('Storage'); + + return { storage }; + }); ``` 2. Deploy to the local network: @@ -18025,19 +18991,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! 
{ + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! { + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. @@ -22190,7 +23206,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -22467,7 +23483,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? 
interface "Input parameters" @@ -22698,7 +23714,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/pages/develop-interoperability-send-messages.md b/.ai/pages/develop-interoperability-send-messages.md index ac760f8db..57fbb8be0 100644 --- a/.ai/pages/develop-interoperability-send-messages.md +++ b/.ai/pages/develop-interoperability-send-messages.md @@ -84,7 +84,8 @@ The [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pa For instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\_blank} from the relay chain to parachains, ensuring secure and controlled communication. ```rust - +pub type PriceForChildParachainDelivery = + ExponentialPrice; ``` For more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\_blank} page. diff --git a/.ai/pages/develop-interoperability-test-and-debug.md b/.ai/pages/develop-interoperability-test-and-debug.md index 8d898ddde..311ca2f26 100644 --- a/.ai/pages/develop-interoperability-test-and-debug.md +++ b/.ai/pages/develop-interoperability-test-and-debug.md @@ -77,19 +77,69 @@ The `xcm-emulator` provides macros for defining a mocked testing environment. Ch - **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\_blank}**: Defines runtime and configuration for parachains. Example: ```rust - + decl_test_parachains! { + pub struct AssetHubWestend { + genesis = genesis::genesis(), + on_init = { + asset_hub_westend_runtime::AuraExt::on_initialize(1); + }, + runtime = asset_hub_westend_runtime, + core = { + XcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue, + LocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId, + ParachainInfo: asset_hub_westend_runtime::ParachainInfo, + MessageOrigin: cumulus_primitives_core::AggregateMessageOrigin, + DigestProvider: (), + }, + pallets = { + PolkadotXcm: asset_hub_westend_runtime::PolkadotXcm, + Balances: asset_hub_westend_runtime::Balances, + Assets: asset_hub_westend_runtime::Assets, + ForeignAssets: asset_hub_westend_runtime::ForeignAssets, + PoolAssets: asset_hub_westend_runtime::PoolAssets, + AssetConversion: asset_hub_westend_runtime::AssetConversion, + SnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend, + Revive: asset_hub_westend_runtime::Revive, + } + }, + } ``` - **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example: ```rust - + decl_test_bridges! 
{ + pub struct RococoWestendMockBridge { + source = BridgeHubRococoPara, + target = BridgeHubWestendPara, + handler = RococoWestendMessageHandler + }, + pub struct WestendRococoMockBridge { + source = BridgeHubWestendPara, + target = BridgeHubRococoPara, + handler = WestendRococoMessageHandler + } + } ``` - **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example: ```rust - + decl_test_networks! { + pub struct WestendMockNet { + relay_chain = Westend, + parachains = vec![ + AssetHubWestend, + BridgeHubWestend, + CollectivesWestend, + CoretimeWestend, + PeopleWestend, + PenpalA, + PenpalB, + ], + bridge = () + }, + } ``` By leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\_blank} article. diff --git a/.ai/pages/develop-interoperability-xcm-runtime-apis.md b/.ai/pages/develop-interoperability-xcm-runtime-apis.md index 6d761bebd..0b676f353 100644 --- a/.ai/pages/develop-interoperability-xcm-runtime-apis.md +++ b/.ai/pages/develop-interoperability-xcm-runtime-apis.md @@ -38,7 +38,7 @@ This API can be used independently for dry-running, double-checking, or testing. This API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains. ```rust - +fn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>; ``` ??? interface "Input parameters" @@ -315,7 +315,7 @@ This API allows a dry-run of any extrinsic and obtaining the outcome if it fails This API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains. ```rust - +fn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>; ``` ??? interface "Input parameters" @@ -546,7 +546,7 @@ To use the API effectively, the client must already know the XCM program to be e Retrieves the list of assets that are acceptable for paying fees when using a specific XCM version ```rust - +fn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>; ``` ??? interface "Input parameters" diff --git a/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md b/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md index 1f9b43dbb..b95195847 100644 --- a/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md +++ b/.ai/pages/develop-parachains-deployment-build-deterministic-runtime.md @@ -107,7 +107,29 @@ To add a GitHub workflow for building the runtime: {% raw %} ```yml - + name: Srtool build + + on: push + + jobs: + srtool: + runs-on: ubuntu-latest + strategy: + matrix: + chain: ["asset-hub-kusama", "asset-hub-westend"] + steps: + - uses: actions/checkout@v3 + - name: Srtool build + id: srtool_build + uses: chevdor/srtool-actions@v0.8.0 + with: + chain: ${{ matrix.chain }} + runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime + - name: Summary + run: | + echo '${{ steps.srtool_build.outputs.json }}' | jq . 
> ${{ matrix.chain }}-srtool-digest.json + cat ${{ matrix.chain }}-srtool-digest.json + echo "Runtime location: ${{ steps.srtool_build.outputs.wasm }}" ``` {% endraw %} diff --git a/.ai/pages/develop-parachains-maintenance-storage-migrations.md b/.ai/pages/develop-parachains-maintenance-storage-migrations.md index ded3b70e9..60d3fe6a8 100644 --- a/.ai/pages/develop-parachains-maintenance-storage-migrations.md +++ b/.ai/pages/develop-parachains-maintenance-storage-migrations.md @@ -170,7 +170,111 @@ Examine the following migration example that transforms a simple `StorageValue` - Migration: ```rust - + use frame_support::{ + storage_alias, + traits::{Get, UncheckedOnRuntimeUpgrade}, + }; + + #[cfg(feature = "try-runtime")] + use alloc::vec::Vec; + + /// Collection of storage item formats from the previous storage version. + /// + /// Required so we can read values in the v0 storage format during the migration. + mod v0 { + use super::*; + + /// V0 type for [`crate::Value`]. + #[storage_alias] + pub type Value = StorageValue, u32>; + } + + /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1. + /// + /// In V0 of the template [`crate::Value`] is just a `u32`. In V1, it has been upgraded to + /// contain the struct [`crate::CurrentAndPreviousValue`]. + /// + /// In this migration, update the on-chain storage for the pallet to reflect the new storage + /// layout. + pub struct InnerMigrateV0ToV1(core::marker::PhantomData); + + impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 { + /// Return the existing [`crate::Value`] so we can check that it was correctly set in + /// `InnerMigrateV0ToV1::post_upgrade`. + #[cfg(feature = "try-runtime")] + fn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> { + use codec::Encode; + + // Access the old value using the `storage_alias` type + let old_value = v0::Value::::get(); + // Return it as an encoded `Vec` + Ok(old_value.encode()) + } + + /// Migrate the storage from V0 to V1. + /// + /// - If the value doesn't exist, there is nothing to do. + /// - If the value exists, it is read and then written back to storage inside a + /// [`crate::CurrentAndPreviousValue`]. + fn on_runtime_upgrade() -> frame_support::weights::Weight { + // Read the old value from storage + if let Some(old_value) = v0::Value::::take() { + // Write the new value to storage + let new = crate::CurrentAndPreviousValue { current: old_value, previous: None }; + crate::Value::::put(new); + // One read + write for taking the old value, and one write for setting the new value + T::DbWeight::get().reads_writes(1, 2) + } else { + // No writes since there was no old value, just one read for checking + T::DbWeight::get().reads(1) + } + } + + /// Verifies the storage was migrated correctly. + /// + /// - If there was no old value, the new value should not be set. + /// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`]. 
+ #[cfg(feature = "try-runtime")] + fn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> { + use codec::Decode; + use frame_support::ensure; + + let maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| { + sp_runtime::TryRuntimeError::Other("Failed to decode old value from storage") + })?; + + match maybe_old_value { + Some(old_value) => { + let expected_new_value = + crate::CurrentAndPreviousValue { current: old_value, previous: None }; + let actual_new_value = crate::Value::::get(); + + ensure!(actual_new_value.is_some(), "New value not set"); + ensure!( + actual_new_value == Some(expected_new_value), + "New value not set correctly" + ); + }, + None => { + ensure!(crate::Value::::get().is_none(), "New value unexpectedly set"); + }, + }; + Ok(()) + } + } + + /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a + /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that: + /// - The migration only runs once when the on-chain storage version is 0 + /// - The on-chain storage version is updated to `1` after the migration executes + /// - Reads/Writes from checking/settings the on-chain storage version are accounted for + pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration< + 0, // The migration will only execute when the on-chain storage version is 0 + 1, // The on-chain storage version will be set to 1 after the migration is complete + InnerMigrateV0ToV1, + crate::pallet::Pallet, + ::DbWeight, + >; ``` ### Migration Organization diff --git a/.ai/pages/develop-parachains-testing-benchmarking.md b/.ai/pages/develop-parachains-testing-benchmarking.md index b56d8b6b3..0cff99c05 100644 --- a/.ai/pages/develop-parachains-testing-benchmarking.md +++ b/.ai/pages/develop-parachains-testing-benchmarking.md @@ -104,7 +104,40 @@ my-pallet/ With the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\_blank} to get started as follows: ```rust title="benchmarking.rs (starter template)" - +//! 
Benchmarking setup for pallet-template
+#![cfg(feature = "runtime-benchmarks")]
+
+use super::*;
+use frame_benchmarking::v2::*;
+
+#[benchmarks]
+mod benchmarks {
+    use super::*;
+    #[cfg(test)]
+    use crate::pallet::Pallet as Template;
+    use frame_system::RawOrigin;
+
+    #[benchmark]
+    fn do_something() {
+        let caller: T::AccountId = whitelisted_caller();
+        #[extrinsic_call]
+        do_something(RawOrigin::Signed(caller), 100);
+
+        assert_eq!(Something::<T>::get().map(|v| v.block_number), Some(100u32.into()));
+    }
+
+    #[benchmark]
+    fn cause_error() {
+        Something::<T>::put(CompositeStruct { block_number: 100u32.into() });
+        let caller: T::AccountId = whitelisted_caller();
+        #[extrinsic_call]
+        cause_error(RawOrigin::Signed(caller));
+
+        assert_eq!(Something::<T>::get().map(|v| v.block_number), Some(101u32.into()));
+    }
+
+    impl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test);
+}
```
In your benchmarking tests, employ these best practices:
diff --git a/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md b/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md
index 97dd3b443..2c14630c4 100644
--- a/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md
+++ b/.ai/pages/develop-smart-contracts-precompiles-interact-with-precompiles.md
@@ -81,7 +81,30 @@ To interact with the ECRecover precompile, you can deploy the `ECRecoverExample`
The SHA-256 precompile computes the SHA-256 hash of the input data.
```solidity title="SHA256.sol"
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.0;
+
+contract SHA256Example {
+    event SHA256Called(bytes result);
+
+    // Address of the SHA256 precompile
+    address constant SHA256_PRECOMPILE = address(0x02);
+
+    bytes public result;
+
+    function callH256(bytes calldata input) public {
+        bool success;
+        bytes memory resultInMemory;
+
+        (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input);
+
+        if (success) {
+            emit SHA256Called(resultInMemory);
+        }
+        result = resultInMemory;
+    }
+}
```
To use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callH256` with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\_blank}, which shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\_blank} module.
@@ -198,7 +221,38 @@ To use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart
The BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations.
```solidity title="BN128Add.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.20; + +contract BN128AddExample { + address constant BN128_ADD_PRECOMPILE = address(0x06); + event BN128Added(uint256 x3, uint256 y3); + + uint256 public resultX; + uint256 public resultY; + + function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public { + bytes memory input = abi.encodePacked( + bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2) + ); + + bool success; + bytes memory output; + + (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input); + + require(success, "BN128Add precompile call failed"); + require(output.length == 64, "Invalid output length"); + + (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256)); + + resultX = x3; + resultY = y3; + + emit BN128Added(x3, y3); + } +} ``` To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\_blank} demonstrates a valid curve addition and checks the result against known expected values. @@ -208,7 +262,42 @@ To use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/sma The BN128Mul precompile performs scalar multiplication on the alt_bn128 curve. ```solidity title="BN128Mul.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract BN128MulExample { + // Precompile address for BN128Mul + address constant BN128_MUL_ADDRESS = address(0x07); + + bytes public result; + // Performs scalar multiplication of a point on the alt_bn128 curve + function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public { + // Format: [x, y, scalar] - each 32 bytes + bytes memory input = abi.encodePacked( + bytes32(x1), + bytes32(y1), + bytes32(scalar) + ); + + (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{ + value: 0 + }(input); + require(success, "BN128Mul precompile call failed"); + + result = resultInMemory; + } + + // Helper to decode result from `result` storage + function getResult() public view returns (uint256 x2, uint256 y2) { + bytes memory tempResult = result; + require(tempResult.length >= 64, "Invalid result length"); + assembly { + x2 := mload(add(tempResult, 32)) + y2 := mload(add(tempResult, 64)) + } + } +} ``` To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`. @@ -218,7 +307,38 @@ To use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-envi The BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification. 
```solidity title="BN128Pairing.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract BN128PairingExample { + // Precompile address for BN128Pairing + address constant BN128_PAIRING_ADDRESS = address(0x08); + + bytes public result; + + // Performs a pairing check on the alt_bn128 curve + function bn128Pairing(bytes memory input) public { + // Call the precompile + (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS + .call{value: 0}(input); + require(success, "BN128Pairing precompile call failed"); + result = resultInMemory; + } + + // Helper function to decode the result from `result` storage + function getResult() public view returns (bool isValid) { + bytes memory tempResult = result; + require(tempResult.length == 32, "Invalid result length"); + + uint256 output; + assembly { + output := mload(add(tempResult, 32)) + } + + isValid = (output == 1); + } +} ``` You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or your preferred environment. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\_blank} contains these tests with working examples. @@ -228,7 +348,105 @@ You can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-env The Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function. ```solidity title="Blake2F.sol" +// SPDX-License-Identifier: MIT +pragma solidity ^0.8.0; + +contract Blake2FExample { + // Precompile address for Blake2F + address constant BLAKE2F_ADDRESS = address(0x09); + + bytes public result; + + function blake2F(bytes memory input) public { + // Input must be exactly 213 bytes + require(input.length == 213, "Invalid input length - must be 213 bytes"); + + // Call the precompile + (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{ + value: 0 + }(input); + require(success, "Blake2F precompile call failed"); + + result = resultInMemory; + } + + // Helper function to decode the result from `result` storage + function getResult() public view returns (bytes32[8] memory output) { + bytes memory tempResult = result; + require(tempResult.length == 64, "Invalid result length"); + + for (uint i = 0; i < 8; i++) { + assembly { + mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i)))) + } + } + } + + + // Helper function to create Blake2F input from parameters + function createBlake2FInput( + uint32 rounds, + bytes32[8] memory h, + bytes32[16] memory m, + bytes8[2] memory t, + bool f + ) public pure returns (bytes memory) { + // Start with rounds (4 bytes, big-endian) + bytes memory input = abi.encodePacked(rounds); + + // Add state vector h (8 * 32 = 256 bytes) + for (uint i = 0; i < 8; i++) { + input = abi.encodePacked(input, h[i]); + } + + // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes) + // Blake2F expects 64-bit words in little-endian format + for (uint i = 0; i < 16; i++) { + // Take only the first 8 bytes of each bytes32 and reverse for little-endian + bytes8 word = bytes8(m[i]); + input = abi.encodePacked(input, word); + } + + // Add offset counters t (2 * 8 = 16 bytes) + input = abi.encodePacked(input, t[0], t[1]); + // Add final block flag (1 byte) + input = abi.encodePacked(input, f ? 
bytes1(0x01) : bytes1(0x00)); + + return input; + } + + // Simplified function that works with raw hex input + function blake2FFromHex(string memory hexInput) public { + bytes memory input = hexStringToBytes(hexInput); + blake2F(input); + } + + // Helper function to convert hex string to bytes + function hexStringToBytes(string memory hexString) public pure returns (bytes memory) { + bytes memory hexBytes = bytes(hexString); + require(hexBytes.length % 2 == 0, "Invalid hex string length"); + + bytes memory result = new bytes(hexBytes.length / 2); + + for (uint i = 0; i < hexBytes.length / 2; i++) { + result[i] = bytes1( + (hexCharToByte(hexBytes[2 * i]) << 4) | + hexCharToByte(hexBytes[2 * i + 1]) + ); + } + + return result; + } + + function hexCharToByte(bytes1 char) internal pure returns (uint8) { + uint8 c = uint8(char); + if (c >= 48 && c <= 57) return c - 48; // 0-9 + if (c >= 65 && c <= 70) return c - 55; // A-F + if (c >= 97 && c <= 102) return c - 87; // a-f + revert("Invalid hex character"); + } +} ``` To use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors. diff --git a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md index a0bdcce8b..61652dd5d 100644 --- a/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md +++ b/.ai/pages/infrastructure-running-a-validator-onboarding-and-offboarding-start-validating.md @@ -181,7 +181,44 @@ touch /etc/systemd/system/polkadot-validator.service In this unit file, you will write the commands that you want to run on server boot/restart: ```systemd title="/etc/systemd/system/polkadot-validator.service" - +[Unit] +Description=Polkadot Node +After=network.target +Documentation=https://github.com/paritytech/polkadot-sdk + +[Service] +EnvironmentFile=-/etc/default/polkadot +ExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS +User=polkadot +Group=polkadot +Restart=always +RestartSec=120 +CapabilityBoundingSet= +LockPersonality=true +NoNewPrivileges=true +PrivateDevices=true +PrivateMounts=true +PrivateTmp=true +PrivateUsers=true +ProtectClock=true +ProtectControlGroups=true +ProtectHostname=true +ProtectKernelModules=true +ProtectKernelTunables=true +ProtectSystem=strict +RemoveIPC=true +RestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX +RestrictNamespaces=false +RestrictSUIDSGID=true +SystemCallArchitectures=native +SystemCallFilter=@system-service +SystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2 +SystemCallFilter=~@clock @module @reboot @swap @privileged +SystemCallFilter=pivot_root +UMask=0027 + +[Install] +WantedBy=multi-user.target ``` !!! 
warning "Restart delay and equivocation risk" diff --git a/.ai/pages/polkadot-protocol-parachain-basics-accounts.md b/.ai/pages/polkadot-protocol-parachain-basics-accounts.md index f5e3bc727..80d2f595f 100644 --- a/.ai/pages/polkadot-protocol-parachain-basics-accounts.md +++ b/.ai/pages/polkadot-protocol-parachain-basics-accounts.md @@ -24,7 +24,16 @@ The [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame The code snippet below shows how accounts are defined: ```rs - + /// The full account information for a particular account ID. + #[pallet::storage] + #[pallet::getter(fn account)] + pub type Account = StorageMap< + _, + Blake2_128Concat, + T::AccountId, + AccountInfo, + ValueQuery, + >; ``` The preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`). @@ -48,7 +57,24 @@ For a detailed explanation of storage maps, see the [`StorageMap`](https://parit The `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules. ```rs - +/// Information of an account. +#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)] +pub struct AccountInfo { + /// The number of transactions this account has sent. + pub nonce: Nonce, + /// The number of other modules that currently depend on this account's existence. The account + /// cannot be reaped until this is zero. + pub consumers: RefCount, + /// The number of other modules that allow this account to exist. The account may not be reaped + /// until this and `sufficients` are both zero. + pub providers: RefCount, + /// The number of modules that allow this account to exist for their own purposes only. The + /// account may not be reaped until this and `providers` are both zero. + pub sufficients: RefCount, + /// The additional data that belongs to this account. Used to store the balance(s) in a lot of + /// chains. + pub data: AccountData, +} ``` The `AccountInfo` structure includes the following components: diff --git a/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md b/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md index 08d62300c..066fd3fbf 100644 --- a/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md +++ b/.ai/pages/tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime.md @@ -20,7 +20,7 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line: ```toml hl_lines="4" title="runtime/Cargo.toml" - + [dependencies] ... polkadot-sdk = { workspace = true, features = [ "pallet-utility", @@ -39,11 +39,9 @@ First, you'll update the runtime's `Cargo.toml` file to include the Utility pall 3. 
In the `[features]` section, add the custom pallet to the `std` feature list: ```toml hl_lines="5" title="Cargo.toml" - [features] - default = ["std"] - std = [ + ... - "custom-pallet/std", + ... ] ``` @@ -167,13 +165,63 @@ Update your root parachain template's `Cargo.toml` file to include your custom p Make sure the `custom-pallet` is a member of the workspace: ```toml hl_lines="4" title="Cargo.toml" - + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] ``` ???- code "./Cargo.toml" ```rust title="./Cargo.toml" - + [workspace.package] + license = "MIT-0" + authors = ["Parity Technologies "] + homepage = "https://paritytech.github.io/polkadot-sdk/" + repository = "https://github.com/paritytech/polkadot-sdk-parachain-template.git" + edition = "2021" + + [workspace] + default-members = ["pallets/template", "runtime"] + members = [ + "node", "pallets/custom-pallet", + "pallets/template", + "runtime", + ] + resolver = "2" + + [workspace.dependencies] + parachain-template-runtime = { path = "./runtime", default-features = false } + pallet-parachain-template = { path = "./pallets/template", default-features = false } + clap = { version = "4.5.13" } + color-print = { version = "0.3.4" } + docify = { version = "0.2.9" } + futures = { version = "0.3.31" } + jsonrpsee = { version = "0.24.3" } + log = { version = "0.4.22", default-features = false } + polkadot-sdk = { version = "2503.0.1", default-features = false } + prometheus-endpoint = { version = "0.17.2", default-features = false, package = "substrate-prometheus-endpoint" } + serde = { version = "1.0.214", default-features = false } + codec = { version = "3.7.4", default-features = false, package = "parity-scale-codec" } + cumulus-pallet-parachain-system = { version = "0.20.0", default-features = false } + hex-literal = { version = "0.4.1", default-features = false } + scale-info = { version = "2.11.6", default-features = false } + serde_json = { version = "1.0.132", default-features = false } + smallvec = { version = "1.11.0", default-features = false } + substrate-wasm-builder = { version = "26.0.1", default-features = false } + frame = { version = "0.9.1", default-features = false, package = "polkadot-sdk-frame" } + + [profile.release] + opt-level = 3 + panic = "unwind" + + [profile.production] + codegen-units = 1 + inherits = "release" + lto = true ``` diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md index d44254595..48031eb9f 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-contracts.md @@ -94,13 +94,53 @@ To build the smart contract, follow the steps below: 6. Add the getter and setter functions: ```solidity - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ??? 
code "Complete Storage.sol contract" ```solidity title="Storage.sol" - + // SPDX-License-Identifier: MIT + pragma solidity ^0.8.28; + + contract Storage { + // State variable to store our number + uint256 private number; + + // Event to notify when the number changes + event NumberChanged(uint256 newNumber); + + // Function to store a new number + function store(uint256 newNumber) public { + number = newNumber; + emit NumberChanged(newNumber); + } + + // Function to retrieve the stored number + function retrieve() public view returns (uint256) { + return number; + } + } ``` ## Understanding the Code diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md index e17d3924d..961facefa 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js.md @@ -77,7 +77,31 @@ npm install ethers@6.13.5 To interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code: ```javascript title="app/utils/ethers.js" - +import { JsonRpcProvider } from 'ethers'; + +export const PASSET_HUB_CONFIG = { + name: 'Passet Hub', + rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC + chainId: 420420422, // Passet Hub testnet chainId + blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/', +}; + +export const getProvider = () => { + return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, { + chainId: PASSET_HUB_CONFIG.chainId, + name: PASSET_HUB_CONFIG.name, + }); +}; + +// Helper to get a signer from a provider +export const getSigner = async (provider) => { + if (window.ethereum) { + await window.ethereum.request({ method: 'eth_requestAccounts' }); + const ethersProvider = new ethers.BrowserProvider(window.ethereum); + return ethersProvider.getSigner(); + } + throw new Error('No Ethereum browser provider detected'); +}; ``` This file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state. @@ -89,13 +113,55 @@ For this dApp, you'll use a simple Storage contract already deployed. 
So, you ne ???+ code "Storage.sol ABI" ```json title="abis/Storage.json" - + [ + { + "inputs": [ + { + "internalType": "uint256", + "name": "_newNumber", + "type": "uint256" + } + ], + "name": "setNumber", + "outputs": [], + "stateMutability": "nonpayable", + "type": "function" + }, + { + "inputs": [], + "name": "storedNumber", + "outputs": [ + { + "internalType": "uint256", + "name": "", + "type": "uint256" + } + ], + "stateMutability": "view", + "type": "function" + } + ] ``` Now, create a file called `app/utils/contract.js`: ```javascript title="app/utils/contract.js" +import { Contract } from 'ethers'; +import { getProvider } from './ethers'; +import StorageABI from '../../abis/Storage.json'; + +export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; + +export const CONTRACT_ABI = StorageABI; + +export const getContract = () => { + const provider = getProvider(); + return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider); +}; +export const getSignedContract = async (signer) => { + return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer); +}; ``` This file defines the contract address, ABI, and functions to create instances of the contract for reading and writing. @@ -105,7 +171,167 @@ This file defines the contract address, ABI, and functions to create instances o Next, let's create a component to handle wallet connections. Create a new file called `app/components/WalletConnect.js`: ```javascript title="app/components/WalletConnect.js" - +'use client'; + +import React, { useState, useEffect } from 'react'; +import { PASSET_HUB_CONFIG } from '../utils/ethers'; + +const WalletConnect = ({ onConnect }) => { + const [account, setAccount] = useState(null); + const [chainId, setChainId] = useState(null); + const [error, setError] = useState(null); + + useEffect(() => { + // Check if user already has an authorized wallet connection + const checkConnection = async () => { + if (window.ethereum) { + try { + // eth_accounts doesn't trigger the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_accounts', + }); + if (accounts.length > 0) { + setAccount(accounts[0]); + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }); + setChainId(parseInt(chainIdHex, 16)); + } + } catch (err) { + console.error('Error checking connection:', err); + setError('Failed to check wallet connection'); + } + } + }; + + checkConnection(); + + if (window.ethereum) { + // Setup wallet event listeners + window.ethereum.on('accountsChanged', (accounts) => { + setAccount(accounts[0] || null); + if (accounts[0] && onConnect) onConnect(accounts[0]); + }); + + window.ethereum.on('chainChanged', (chainIdHex) => { + setChainId(parseInt(chainIdHex, 16)); + }); + } + + return () => { + // Cleanup event listeners + if (window.ethereum) { + window.ethereum.removeListener('accountsChanged', () => {}); + window.ethereum.removeListener('chainChanged', () => {}); + } + }; + }, [onConnect]); + + const connectWallet = async () => { + if (!window.ethereum) { + setError( + 'MetaMask not detected! Please install MetaMask to use this dApp.' 
+ ); + return; + } + + try { + // eth_requestAccounts triggers the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_requestAccounts', + }); + setAccount(accounts[0]); + + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }); + const currentChainId = parseInt(chainIdHex, 16); + setChainId(currentChainId); + + // Prompt user to switch networks if needed + if (currentChainId !== PASSET_HUB_CONFIG.chainId) { + await switchNetwork(); + } + + if (onConnect) onConnect(accounts[0]); + } catch (err) { + console.error('Error connecting to wallet:', err); + setError('Failed to connect wallet'); + } + }; + + const switchNetwork = async () => { + try { + await window.ethereum.request({ + method: 'wallet_switchEthereumChain', + params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }], + }); + } catch (switchError) { + // Error 4902 means the chain hasn't been added to MetaMask + if (switchError.code === 4902) { + try { + await window.ethereum.request({ + method: 'wallet_addEthereumChain', + params: [ + { + chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`, + chainName: PASSET_HUB_CONFIG.name, + rpcUrls: [PASSET_HUB_CONFIG.rpc], + blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer], + }, + ], + }); + } catch (addError) { + setError('Failed to add network to wallet'); + } + } else { + setError('Failed to switch network'); + } + } + }; + + // UI-only disconnection - MetaMask doesn't support programmatic disconnection + const disconnectWallet = () => { + setAccount(null); + }; + + return ( +
+ {error &&

{error}

} + + {!account ? ( + + ) : ( +
+ + {`${account.substring(0, 6)}...${account.substring(38)}`} + + + {chainId !== PASSET_HUB_CONFIG.chainId && ( + + )} +
+ )} +
+ ); +}; + +export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. @@ -114,9 +340,25 @@ To integrate this component to your dApp, you need to overwrite the existing boi ```javascript title="app/page.js" - - - +import { useState } from 'react'; + +import WalletConnect from './components/WalletConnect'; +export default function Home() { + const [account, setAccount] = useState(null); + + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ +
+ ); +} ``` In your terminal, you can launch your project by running: @@ -134,7 +376,64 @@ And you will see the following: Now, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`: ```javascript title="app/components/ReadContract.js" - +'use client'; + +import React, { useState, useEffect } from 'react'; +import { getContract } from '../utils/contract'; + +const ReadContract = () => { + const [storedNumber, setStoredNumber] = useState(null); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(null); + + useEffect(() => { + // Function to read data from the blockchain + const fetchData = async () => { + try { + setLoading(true); + const contract = getContract(); + // Call the smart contract's storedNumber function + const number = await contract.storedNumber(); + setStoredNumber(number.toString()); + setError(null); + } catch (err) { + console.error('Error fetching stored number:', err); + setError('Failed to fetch data from the contract'); + } finally { + setLoading(false); + } + }; + + fetchData(); + + // Poll for updates every 10 seconds to keep UI in sync with blockchain + const interval = setInterval(fetchData, 10000); + + // Clean up interval on component unmount + return () => clearInterval(interval); + }, []); + + return ( +
+

Contract Data

+ {loading ? ( +
+
+
+ ) : error ? ( +

{error}

+ ) : ( +
+

+ Stored Number: {storedNumber} +

+
+ )} +
+ ); +}; + +export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically. @@ -143,9 +442,27 @@ To see this change in your dApp, you need to integrate this component into the ` ```javascript title="app/page.js" - - - +import { useState } from 'react'; + +import WalletConnect from './components/WalletConnect'; +import ReadContract from './components/ReadContract'; +export default function Home() { + const [account, setAccount] = useState(null); + + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ + +
+ ); +} ``` Your dApp will automatically be updated to the following: @@ -157,7 +474,119 @@ Your dApp will automatically be updated to the following: Finally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`: ```javascript title="app/components/WriteContract.js" - +'use client'; + +import { useState } from 'react'; +import { getSignedContract } from '../utils/contract'; +import { ethers } from 'ethers'; + +const WriteContract = ({ account }) => { + const [newNumber, setNewNumber] = useState(''); + const [status, setStatus] = useState({ type: null, message: '' }); + const [isSubmitting, setIsSubmitting] = useState(false); + + const handleSubmit = async (e) => { + e.preventDefault(); + + // Validation checks + if (!account) { + setStatus({ type: 'error', message: 'Please connect your wallet first' }); + return; + } + + if (!newNumber || isNaN(Number(newNumber))) { + setStatus({ type: 'error', message: 'Please enter a valid number' }); + return; + } + + try { + setIsSubmitting(true); + setStatus({ type: 'info', message: 'Initiating transaction...' }); + + // Get a signer from the connected wallet + const provider = new ethers.BrowserProvider(window.ethereum); + const signer = await provider.getSigner(); + const contract = await getSignedContract(signer); + + // Send transaction to blockchain and wait for user confirmation in wallet + setStatus({ + type: 'info', + message: 'Please confirm the transaction in your wallet...', + }); + + // Call the contract's setNumber function + const tx = await contract.setNumber(newNumber); + + // Wait for transaction to be mined + setStatus({ + type: 'info', + message: 'Transaction submitted. Waiting for confirmation...', + }); + const receipt = await tx.wait(); + + setStatus({ + type: 'success', + message: `Transaction confirmed! Transaction hash: ${receipt.hash}`, + }); + setNewNumber(''); + } catch (err) { + console.error('Error updating number:', err); + + // Error code 4001 is MetaMask's code for user rejection + if (err.code === 4001) { + setStatus({ type: 'error', message: 'Transaction rejected by user.' }); + } else { + setStatus({ + type: 'error', + message: `Error: ${err.message || 'Failed to send transaction'}`, + }); + } + } finally { + setIsSubmitting(false); + } + }; + + return ( +
+

Update Stored Number

+ {status.message && ( +
+ {status.message} +
+ )} +
+ setNewNumber(e.target.value)} + disabled={isSubmitting || !account} + className="w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400" + /> + +
+ {!account && ( +

+ Connect your wallet to update the stored number. +

+ )} +
+ ); +}; + +export default WriteContract; ``` This component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed. @@ -165,7 +594,32 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.js` file to integrate all components: ```javascript title="app/page.js" - +'use client'; + +import { useState } from 'react'; + +import WalletConnect from './components/WalletConnect'; +import ReadContract from './components/ReadContract'; +import WriteContract from './components/WriteContract'; + +export default function Home() { + const [account, setAccount] = useState(null); + + const handleConnect = (connectedAccount) => { + setAccount(connectedAccount); + }; + + return ( +
+

+ Ethers.js dApp - Passet Hub Smart Contracts +

+ + + +
+ ); +} ``` The completed UI will display: diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md index a208400f4..ba55872d5 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-create-dapp-viem.md @@ -77,7 +77,47 @@ npm install --save-dev typescript @types/node To interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code: ```typescript title="viem.ts" - +import { createPublicClient, http, createWalletClient, custom } from 'viem' +import 'viem/window'; + + +const transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io') + +// Configure the Passet Hub chain +export const passetHub = { + id: 420420422, + name: 'Passet Hub', + network: 'passet-hub', + nativeCurrency: { + decimals: 18, + name: 'PAS', + symbol: 'PAS', + }, + rpcUrls: { + default: { + http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'], + }, + }, +} as const + +// Create a public client for reading data +export const publicClient = createPublicClient({ + chain: passetHub, + transport +}) + +// Create a wallet client for signing transactions +export const getWalletClient = async () => { + if (typeof window !== 'undefined' && window.ethereum) { + const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' }); + return createWalletClient({ + chain: passetHub, + transport: custom(window.ethereum), + account, + }); + } + throw new Error('No Ethereum browser provider detected'); +}; ``` This file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\_blank} reference for more information. 
@@ -123,7 +163,31 @@ Create a folder called `abis` at the root of your project, then create a file na Next, create a file called `utils/contract.ts`: ```typescript title="contract.ts" +import { getContract } from 'viem'; +import { publicClient, getWalletClient } from './viem'; +import StorageABI from '../../abis/Storage.json'; + +export const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f'; +export const CONTRACT_ABI = StorageABI; + +// Create a function to get a contract instance for reading +export const getContractInstance = () => { + return getContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + client: publicClient, + }); +}; +// Create a function to get a contract instance with a signer for writing +export const getSignedContract = async () => { + const walletClient = await getWalletClient(); + return getContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + client: walletClient, + }); +}; ``` This file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts. @@ -133,7 +197,180 @@ This file defines the contract address, ABI, and functions to create a viem [con Now, let's create a component to handle wallet connections. Create a new file called `components/WalletConnect.tsx`: ```typescript title="WalletConnect.tsx" +"use client"; + +import React, { useState, useEffect } from "react"; +import { passetHub } from "../utils/viem"; + +interface WalletConnectProps { + onConnect: (account: string) => void; +} + +const WalletConnect: React.FC = ({ onConnect }) => { + const [account, setAccount] = useState(null); + const [chainId, setChainId] = useState(null); + const [error, setError] = useState(null); + + useEffect(() => { + // Check if user already has an authorized wallet connection + const checkConnection = async () => { + if (typeof window !== 'undefined' && window.ethereum) { + try { + // eth_accounts doesn't trigger the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_accounts', + }) as string[]; + + if (accounts.length > 0) { + setAccount(accounts[0]); + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }) as string; + setChainId(parseInt(chainIdHex, 16)); + onConnect(accounts[0]); + } + } catch (err) { + console.error('Error checking connection:', err); + setError('Failed to check wallet connection'); + } + } + }; + + checkConnection(); + + if (typeof window !== 'undefined' && window.ethereum) { + // Setup wallet event listeners + window.ethereum.on('accountsChanged', (accounts: string[]) => { + setAccount(accounts[0] || null); + if (accounts[0]) onConnect(accounts[0]); + }); + + window.ethereum.on('chainChanged', (chainIdHex: string) => { + setChainId(parseInt(chainIdHex, 16)); + }); + } + return () => { + // Cleanup event listeners + if (typeof window !== 'undefined' && window.ethereum) { + window.ethereum.removeListener('accountsChanged', () => {}); + window.ethereum.removeListener('chainChanged', () => {}); + } + }; + }, [onConnect]); + + const connectWallet = async () => { + if (typeof window === 'undefined' || !window.ethereum) { + setError( + 'MetaMask not detected! Please install MetaMask to use this dApp.' 
+ ); + return; + } + + try { + // eth_requestAccounts triggers the wallet popup + const accounts = await window.ethereum.request({ + method: 'eth_requestAccounts', + }) as string[]; + + setAccount(accounts[0]); + + const chainIdHex = await window.ethereum.request({ + method: 'eth_chainId', + }) as string; + + const currentChainId = parseInt(chainIdHex, 16); + setChainId(currentChainId); + + // Prompt user to switch networks if needed + if (currentChainId !== passetHub.id) { + await switchNetwork(); + } + + onConnect(accounts[0]); + } catch (err) { + console.error('Error connecting to wallet:', err); + setError('Failed to connect wallet'); + } + }; + + const switchNetwork = async () => { + console.log('Switch network') + try { + await window.ethereum.request({ + method: 'wallet_switchEthereumChain', + params: [{ chainId: `0x${passetHub.id.toString(16)}` }], + }); + } catch (switchError: any) { + // Error 4902 means the chain hasn't been added to MetaMask + if (switchError.code === 4902) { + try { + await window.ethereum.request({ + method: 'wallet_addEthereumChain', + params: [ + { + chainId: `0x${passetHub.id.toString(16)}`, + chainName: passetHub.name, + rpcUrls: [passetHub.rpcUrls.default.http[0]], + nativeCurrency: { + name: passetHub.nativeCurrency.name, + symbol: passetHub.nativeCurrency.symbol, + decimals: passetHub.nativeCurrency.decimals, + }, + }, + ], + }); + } catch (addError) { + setError('Failed to add network to wallet'); + } + } else { + setError('Failed to switch network'); + } + } + }; + + // UI-only disconnection - MetaMask doesn't support programmatic disconnection + const disconnectWallet = () => { + setAccount(null); + }; + + return ( +
+ {error &&

{error}

} + + {!account ? ( + + ) : ( +
+ + {`${account.substring(0, 6)}...${account.substring(38)}`} + + + {chainId !== passetHub.id && ( + + )} +
+ )} +
+ ); +}; + +export default WalletConnect; ``` This component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected. @@ -142,9 +379,24 @@ To use this component in your dApp, replace the existing boilerplate in `app/pag ```typescript title="page.tsx" +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; - + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ +
+ ); +} ``` Now you're ready to run your dApp. From your project directory, execute: @@ -162,7 +414,70 @@ Navigate to `http://localhost:3000` in your browser, and you should see your dAp Now, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`: ```typescript title="ReadContract.tsx" +'use client'; + +import React, { useState, useEffect } from 'react'; +import { publicClient } from '../utils/viem'; +import { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract'; + +const ReadContract: React.FC = () => { + const [storedNumber, setStoredNumber] = useState(null); + const [loading, setLoading] = useState(true); + const [error, setError] = useState(null); + + useEffect(() => { + // Function to read data from the blockchain + const fetchData = async () => { + try { + setLoading(true); + // Call the smart contract's storedNumber function + const number = await publicClient.readContract({ + address: CONTRACT_ADDRESS, + abi: CONTRACT_ABI, + functionName: 'storedNumber', + args: [], + }) as bigint; + + setStoredNumber(number.toString()); + setError(null); + } catch (err) { + console.error('Error fetching stored number:', err); + setError('Failed to fetch data from the contract'); + } finally { + setLoading(false); + } + }; + + fetchData(); + // Poll for updates every 10 seconds to keep UI in sync with blockchain + const interval = setInterval(fetchData, 10000); + + // Clean up interval on component unmount + return () => clearInterval(interval); + }, []); + + return ( +
+

Contract Data

+ {loading ? ( +
+
+
+ ) : error ? ( +

{error}

+ ) : ( +
+

+ Stored Number: {storedNumber} +

+
+ )} +
+ ); +}; + +export default ReadContract; ``` This component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state. @@ -171,9 +486,26 @@ To reflect this change in your dApp, incorporate this component into the `app/pa ```typescript title="page.tsx" +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +import ReadContract from "./components/ReadContract"; +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; - + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ + +
+ ); +} ``` And you will see in your browser: @@ -409,7 +741,31 @@ This component allows users to input a new number and send a transaction to upda Update the `app/page.tsx` file to integrate all components: ```typescript title="page.tsx" +"use client"; + +import { useState } from "react"; +import WalletConnect from "./components/WalletConnect"; +import ReadContract from "./components/ReadContract"; +import WriteContract from "./components/WriteContract"; + +export default function Home() { + const [account, setAccount] = useState(null); + const handleConnect = (connectedAccount: string) => { + setAccount(connectedAccount); + }; + + return ( +
+

+ Viem dApp - Passet Hub Smart Contracts +

+ + + +
+ ); +} ``` After that, you will see: diff --git a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md index b8b568f72..db839e043 100644 --- a/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md +++ b/.ai/pages/tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat.md @@ -62,7 +62,42 @@ Let's start by setting up Hardhat for your Storage contract project: 6. Configure Hardhat by updating the `hardhat.config.js` file: ```javascript title="hardhat.config.js" - + require("@nomicfoundation/hardhat-toolbox"); + + require("@parity/hardhat-polkadot"); + + const { vars } = require("hardhat/config"); + + /** @type import('hardhat/config').HardhatUserConfig */ + module.exports = { + solidity: "0.8.28", + resolc: { + compilerSource: "npm", + }, + networks: { + hardhat: { + polkavm: true, + nodeConfig: { + nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE', + rpcPort: 8000, + dev: true, + }, + adapterConfig: { + adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER', + dev: true, + }, + }, + localNode: { + polkavm: true, + url: `http://127.0.0.1:8545`, + }, + passetHub: { + polkavm: true, + url: 'https://testnet-passet-hub-eth-rpc.polkadot.io', + accounts: [vars.get("PRIVATE_KEY")], + }, + }, + }; ``` Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. @@ -316,7 +351,13 @@ Testing is a critical part of smart contract development. Hardhat makes it easy 1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic: ```javascript title="StorageModule.js" - + const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules'); + + module.exports = buildModule('StorageModule', (m) => { + const storage = m.contract('Storage'); + + return { storage }; + }); ``` 2. 
Deploy to the local network: diff --git a/.ai/site-index.json b/.ai/site-index.json index 7129eaa28..159f8c0f3 100644 --- a/.ai/site-index.json +++ b/.ai/site-index.json @@ -54,7 +54,7 @@ "estimated_token_count_total": 1505 }, "hash": "sha256:2b017d8a89f8734b9cbb501f03612a22657d2f8d4d85c51e490e4c8ca4bf771b", - "last_modified": "2025-10-24T21:11:54+00:00", + "last_modified": "2025-10-27T13:59:09+00:00", "token_estimator": "heuristic-v1" }, { @@ -106,13 +106,13 @@ } ], "stats": { - "chars": 7032, - "words": 970, + "chars": 7146, + "words": 978, "headings": 7, - "estimated_token_count_total": 1620 + "estimated_token_count_total": 1635 }, - "hash": "sha256:c31f4cc6f58644a6a03957d32e74e65f30e6b4f5214416a9e379de64144a0833", - "last_modified": "2025-10-24T21:11:54+00:00", + "hash": "sha256:46252e238b0b51105148dc622da6d8809c55ec11da7ec7b2953c35ca52f5f585", + "last_modified": "2025-10-27T13:59:10+00:00", "token_estimator": "heuristic-v1" }, { @@ -149,13 +149,13 @@ } ], "stats": { - "chars": 5974, - "words": 747, + "chars": 7729, + "words": 836, "headings": 4, - "estimated_token_count_total": 1271 + "estimated_token_count_total": 1487 }, - "hash": "sha256:84645b90dc0da9db8e83de27cf4c30bfff320498e249f43d6845b782a0dcd082", - "last_modified": "2025-10-24T21:11:55+00:00", + "hash": "sha256:b07cb65636f24dbff99f21a5c6e4ac047e6455a879c3f9bbf692514fc24da17b", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -222,7 +222,7 @@ "estimated_token_count_total": 955 }, "hash": "sha256:72ee7394fd1308c111a8d548cb4dc63c6b9bc5b6e2bb556dd1baacbaedb92286", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -274,7 +274,7 @@ "estimated_token_count_total": 876 }, "hash": "sha256:d6cb22337280a19bdf24981dcba98f337d48ee4f79ce7ac040466ef1cb4b330b", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -356,7 +356,7 @@ "estimated_token_count_total": 2744 }, "hash": "sha256:1a2d34ccab19bd71263763bbc294977acf34f5800398f51398753594cfc7d7a6", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -428,7 +428,7 @@ "estimated_token_count_total": 608 }, "hash": "sha256:7bba6105d99721373aa6f494627d20af97b1851c19703f26be26c32f0c83524b", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -495,7 +495,7 @@ "estimated_token_count_total": 558 }, "hash": "sha256:b79fe56c9604712825bdf30d17667fd8f237fce9691be0d8d042d38691dbba7a", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -547,7 +547,7 @@ "estimated_token_count_total": 348 }, "hash": "sha256:11cd8d428fa9c3e70490da5c63ce4597cd89ec46306d2bb49b016ced6aa68c3d", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -574,7 +574,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:3821c2ef97699091b76e1de58e6d95e866df69d39fca16f2a15c156b71da5b22", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -601,7 +601,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:634e299f347beb8ad690697943bb7f99915d62d40cda4227179619ed18abe2ff", - "last_modified": 
"2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -649,7 +649,7 @@ "estimated_token_count_total": 1373 }, "hash": "sha256:fc85c27ad58c1ca6d0e1fcded4b8e2b6e3d0e888ed4aa99158e21a5e799f5e6b", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -702,7 +702,7 @@ "estimated_token_count_total": 4979 }, "hash": "sha256:ed3b7c8101b69f9c907cca7c5edfef67fdb5e7bc3c8df8d9fbad297f9dd3c80a", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -759,7 +759,7 @@ "estimated_token_count_total": 1781 }, "hash": "sha256:35c71a215558cd0642d363e4515ad240093995d42720e6495cd2994c859243e4", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -806,7 +806,7 @@ "estimated_token_count_total": 1449 }, "hash": "sha256:346061a3b851699f815068b42a949f7a2259f6ece083c97cf35538cb7bd4e547", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -853,7 +853,7 @@ "estimated_token_count_total": 1082 }, "hash": "sha256:ec82957c768c2c07a272e7a28659c812b223df836e21372b1642f0bb249d7b39", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -895,7 +895,7 @@ "estimated_token_count_total": 4182 }, "hash": "sha256:25a2c4b5830df38e0aacec94d288179064742759e7df31fcb9905ad405e78fc3", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -922,7 +922,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:63584f5b1dab7b67b18b35b47dfc19d00ad5c013804772f0d653a11ac3fca38d", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -954,7 +954,7 @@ "estimated_token_count_total": 328 }, "hash": "sha256:30ffcc12fff151fd0fa1baedfa803ecbb15106504df99c5a032ca173fffe0eca", - "last_modified": "2025-10-24T21:11:55+00:00", + "last_modified": "2025-10-27T13:59:12+00:00", "token_estimator": "heuristic-v1" }, { @@ -1016,13 +1016,13 @@ } ], "stats": { - "chars": 32843, - "words": 3133, + "chars": 33184, + "words": 3164, "headings": 9, - "estimated_token_count_total": 6440 + "estimated_token_count_total": 6512 }, - "hash": "sha256:4d86accdf9d31b7763b22ab53b78ea3008d37b634ee7e454ff8e9adbd0876698", - "last_modified": "2025-10-24T21:11:57+00:00", + "hash": "sha256:43a4a5832611e49024022c1e9e825742919017f959036bbcb82e82622d0daa18", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1054,7 +1054,7 @@ "estimated_token_count_total": 390 }, "hash": "sha256:3ca63851d29942ed00d3f143930edd1248a6321d1d8f6473ae697b72b0b9116e", - "last_modified": "2025-10-24T21:11:54+00:00", + "last_modified": "2025-10-27T13:59:09+00:00", "token_estimator": "heuristic-v1" }, { @@ -1117,7 +1117,7 @@ "estimated_token_count_total": 1520 }, "hash": "sha256:ed09ef7a6abe21204006186fd5791ada7597688fad67e30244dc449c51330309", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1179,7 +1179,7 @@ "estimated_token_count_total": 2598 }, "hash": "sha256:b2b3d8c048863e7760f633b12ab2a0202c741be3050ea4beafb9a7265cfe96b5", - 
"last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1236,7 +1236,7 @@ "estimated_token_count_total": 1219 }, "hash": "sha256:262e7a3ad3d0a0102897c52c7589e3f94c7827c441398b3b446b205f6c6753d3", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1278,7 +1278,7 @@ "estimated_token_count_total": 905 }, "hash": "sha256:ad8e6d9c77d5451c5f4d17f8e6311b21e6ad24eae8780fd4c3ae6013744822cf", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1345,7 +1345,7 @@ "estimated_token_count_total": 3995 }, "hash": "sha256:19997d390abf2847824024ba923f46a61106ef77544d256d50b371210816b309", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1413,7 +1413,7 @@ "estimated_token_count_total": 2021 }, "hash": "sha256:d253314c3db3e631a43137fbc9756eac3143c86c49a3d7a6c109f070f384ef84", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1445,7 +1445,7 @@ "estimated_token_count_total": 326 }, "hash": "sha256:705127e925f797216ab35ca7d0b4bb4fe56ee3c252318d35678e2d5f330a6571", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1506,13 +1506,13 @@ } ], "stats": { - "chars": 7622, - "words": 1153, + "chars": 8470, + "words": 1227, "headings": 9, - "estimated_token_count_total": 1771 + "estimated_token_count_total": 1944 }, - "hash": "sha256:0f8b6191da1d8ed5c569081d33e8ae821c76c8272aefc5009ddc1754100d45a0", - "last_modified": "2025-10-24T21:11:57+00:00", + "hash": "sha256:4fc8cab40e982e860b64d9aede1058fe7fa82ec321ac215b919db00c4df0a9c0", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1584,7 +1584,7 @@ "estimated_token_count_total": 3072 }, "hash": "sha256:ea36f84c753f4671c27d2d5ad1f785ddadb6b333a7436208dc9b61d5b079cf21", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1666,7 +1666,7 @@ "estimated_token_count_total": 3023 }, "hash": "sha256:f89b54fce05c6e26b58bc8a38694953422faf4a3559799a7d2f70dcfd6176304", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1718,7 +1718,7 @@ "estimated_token_count_total": 1287 }, "hash": "sha256:9686bce57413e86675e88ef7a2ce1e1f70226d10c7df8125f3c2bc7f729fcedd", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1770,7 +1770,7 @@ "estimated_token_count_total": 744 }, "hash": "sha256:358ed14147b96b47deb61df9a1ea0e1103a139ea5edb78c5d50a48d5a779b80d", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1807,7 +1807,7 @@ "estimated_token_count_total": 957 }, "hash": "sha256:1acbec60b62ffd3359fa04d224e8be0154d6d115a65b2e33d119905d382a7f17", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1890,7 +1890,7 @@ "estimated_token_count_total": 2709 }, "hash": 
"sha256:2ee5656f749b4bca445172f2bc66c7fc39af40ff173626662ae4c399f49cf909", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -1948,7 +1948,7 @@ "estimated_token_count_total": 1892 }, "hash": "sha256:74de798c287cae75729e7db54019507f03a361dbbd1f2bb58c4694605f83efab", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2000,7 +2000,7 @@ "estimated_token_count_total": 4725 }, "hash": "sha256:b17e06e9e6bced8db89c193fac16c297b7f263c4a6613bf290b162f0f651ddb6", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2062,7 +2062,7 @@ "estimated_token_count_total": 1819 }, "hash": "sha256:b0c1535fa8e969a9bdeee426a5a35a42b4649121fb8ce6fd2b15fdeba35b5d5f", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2109,7 +2109,7 @@ "estimated_token_count_total": 1161 }, "hash": "sha256:07e63e1e99b9acf1cc3b5ef8fa1f06ff22182b2a801582ce800eba37d7d39408", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2175,13 +2175,13 @@ } ], "stats": { - "chars": 14162, - "words": 1849, + "chars": 18500, + "words": 2363, "headings": 10, - "estimated_token_count_total": 2936 + "estimated_token_count_total": 4014 }, - "hash": "sha256:59387d27cb1f775bdb575cd61e9168c63ebc93e0afe020e93ea6fce3d2f61f5b", - "last_modified": "2025-10-24T21:11:57+00:00", + "hash": "sha256:55dc252fdecf1590048ce8d009b822e90231442abe81e9593cf1635944a31336", + "last_modified": "2025-10-27T13:59:18+00:00", "token_estimator": "heuristic-v1" }, { @@ -2233,7 +2233,7 @@ "estimated_token_count_total": 2030 }, "hash": "sha256:f4964f894f7cd2fdfd699c017b4bd25cffc322b03a5a88a36c682cf952832ccc", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:18+00:00", "token_estimator": "heuristic-v1" }, { @@ -2265,7 +2265,7 @@ "estimated_token_count_total": 230 }, "hash": "sha256:f786ec04fd5c7179716a160f93f6bff3c497839cc3627e2279d9bec234ce0c3c", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2326,13 +2326,13 @@ } ], "stats": { - "chars": 13835, - "words": 1787, + "chars": 14731, + "words": 1881, "headings": 9, - "estimated_token_count_total": 3074 + "estimated_token_count_total": 3342 }, - "hash": "sha256:50d751cce37dd3db81dcc3c6014dcc2b1dd7b0c95bfd78c2d69b6d06f6444e37", - "last_modified": "2025-10-24T21:11:58+00:00", + "hash": "sha256:9d6daa3f4daf149ae822b60060d14ff022bd4b3440cecdc969a48c105eb82a21", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2384,7 +2384,7 @@ "estimated_token_count_total": 1572 }, "hash": "sha256:68fc67390e24741081c9a04d78951e76c7d4ff7cf6eddaba7dcbbdc1812c71d3", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2441,7 +2441,7 @@ "estimated_token_count_total": 1559 }, "hash": "sha256:0024f5e4c12ab7b019e5ee183e7c78d175e1125868c5458b97d3accd9fac75bc", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2473,7 +2473,7 @@ "estimated_token_count_total": 205 }, "hash": 
"sha256:16f4f67b56ecef53c3c7ab09c438dcc9d4e613b0824df5b1691bd7c4f6296eda", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2505,7 +2505,7 @@ "estimated_token_count_total": 341 }, "hash": "sha256:94488b5170bf7d37c72418d53e24ac36b124abf0ab1cf125f16e75187e626e4e", - "last_modified": "2025-10-24T21:11:57+00:00", + "last_modified": "2025-10-27T13:59:16+00:00", "token_estimator": "heuristic-v1" }, { @@ -2548,7 +2548,7 @@ "estimated_token_count_total": 313 }, "hash": "sha256:dae93f5037ef8a3f508da802a016df748ce0aed69620348b9895f712609f7e84", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2585,7 +2585,7 @@ "estimated_token_count_total": 505 }, "hash": "sha256:758eb0881fd029ab949ca4ac17ed0dd50c8f1e1e6109f7d0f36416a1082b21e7", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2622,7 +2622,7 @@ "estimated_token_count_total": 570 }, "hash": "sha256:1247dfb5f5ac040bca81cd1002153e0ee53f4052b2a3d40b623834bd7f00d065", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2699,7 +2699,7 @@ "estimated_token_count_total": 6228 }, "hash": "sha256:72e41f816f07026d96c803f399c71852aa1151c464e79cec3e1746b282d5eaae", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2777,7 +2777,7 @@ "estimated_token_count_total": 4188 }, "hash": "sha256:fe008393aa37c27bb71b4483d4e2c4fbcda94f8c1be461fdd07eff40efbb4e26", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2840,7 +2840,7 @@ "estimated_token_count_total": 1375 }, "hash": "sha256:8e6bfed5fa59bb748e80698ea702f62ce6951c48bdb955ee9ef0d3516e856887", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -2872,7 +2872,7 @@ "estimated_token_count_total": 323 }, "hash": "sha256:5c3a10769e30b4da62e6c188e99310354e6e9af4595c7920c2977a54b8e1853c", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3019,7 +3019,7 @@ "estimated_token_count_total": 1605 }, "hash": "sha256:89410eccd72495aa0a4eecf229c74a8f2db31994c6a03b9957c2be92ea227520", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3216,7 +3216,7 @@ "estimated_token_count_total": 9750 }, "hash": "sha256:1fb7a20bc4a799a771954720428029419ec73afa640e589590c43dd041a7e307", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3304,7 +3304,7 @@ "estimated_token_count_total": 4475 }, "hash": "sha256:f0cee7ccb3cd294e8f909a220bb63987239ef8155c187a04f8c4864ffdcde288", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3397,7 +3397,7 @@ "estimated_token_count_total": 3900 }, "hash": "sha256:a7541553a50a250521c0a280f997d614763c643b1028147f3fb61391950bda15", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { 
@@ -3470,7 +3470,7 @@ "estimated_token_count_total": 3250 }, "hash": "sha256:bc771f912627fa09cad64adab1bc81c052f650d6c5a3b4f0c91883a98f6628da", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3543,7 +3543,7 @@ "estimated_token_count_total": 3033 }, "hash": "sha256:bc87533eaf42a979a0c17f50ecdc668c364889257c7e0d27b81129770660fd53", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3596,7 +3596,7 @@ "estimated_token_count_total": 2512 }, "hash": "sha256:5d13a0873a78a9802b06686d7caafbf4d23b6ba1edf7d3518943301f2b0110c4", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3628,7 +3628,7 @@ "estimated_token_count_total": 436 }, "hash": "sha256:fa9fb58c7fb7a1c86f147b9c95d3ef65a4aed6b559989dc2d439efec21a80be4", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3670,7 +3670,7 @@ "estimated_token_count_total": 2432 }, "hash": "sha256:809d0ff921587f29045df1d31a5a9fe32ee13fa7b9698aa27ff9f60b2aa7a4d8", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3743,7 +3743,7 @@ "estimated_token_count_total": 1118 }, "hash": "sha256:0468268436ffdb759cad8390a838d5fba2391118baa8fd8cd494b36397b10329", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3819,13 +3819,13 @@ } ], "stats": { - "chars": 11136, - "words": 1417, + "chars": 18009, + "words": 2181, "headings": 12, - "estimated_token_count_total": 2366 + "estimated_token_count_total": 3820 }, - "hash": "sha256:454593143212705cbf14c3c6277c42c3ca6aa631c204918e8332833f591887c3", - "last_modified": "2025-10-24T21:12:00+00:00", + "hash": "sha256:10c2497147b1d5404e8ab22432832d54fd0ebcf5eb36bbbe9e2d38308a1dfe72", + "last_modified": "2025-10-27T13:59:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -3892,7 +3892,7 @@ "estimated_token_count_total": 2317 }, "hash": "sha256:605d2cbb7eabb2ea0fd928bc3ecdf9ee8b095e3dd9643f2b0918fef7b5a3f4a8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -3934,7 +3934,7 @@ "estimated_token_count_total": 1223 }, "hash": "sha256:798353114d43dee2873e28b293876c0761e2fef596bc3327c5986a4343c70da1", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -3987,7 +3987,7 @@ "estimated_token_count_total": 1638 }, "hash": "sha256:807cee6869059dd933905d1cf6c76e3b86e02baee3de3113f7e5b4c8697fbd22", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4019,7 +4019,7 @@ "estimated_token_count_total": 190 }, "hash": "sha256:97ce7c57e3795b573b61011b0f5430da5a52380214d892ee58ee2aa61778caef", - "last_modified": "2025-10-24T21:11:58+00:00", + "last_modified": "2025-10-27T13:59:19+00:00", "token_estimator": "heuristic-v1" }, { @@ -4087,7 +4087,7 @@ "estimated_token_count_total": 2300 }, "hash": "sha256:ba24e31e2ad94fbf1d73f1878da92dd2e1476db00170780bbdf0e65ab18bc961", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", 
"token_estimator": "heuristic-v1" }, { @@ -4140,7 +4140,7 @@ "estimated_token_count_total": 1987 }, "hash": "sha256:2ca93b09d3bb9159bbf53816886a9b242bb3c13b996c51fd52962e049e2d5477", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4213,7 +4213,7 @@ "estimated_token_count_total": 1084 }, "hash": "sha256:7f533abe61586af8438e350c41b741d74a8edb839f9dc4139bc4619ba3748258", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4281,7 +4281,7 @@ "estimated_token_count_total": 1166 }, "hash": "sha256:ed3986f30880fefca5975fcdc847c68b4aca65862c63e3002b25391b0521781d", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4339,7 +4339,7 @@ "estimated_token_count_total": 942 }, "hash": "sha256:8987fc35cd28602054ee018031f773e2e3837425107c51d0e2ac68a94b86e9c0", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4392,7 +4392,7 @@ "estimated_token_count_total": 1945 }, "hash": "sha256:b8759f61ab57b636228b69d5770c74591998b912cd4596e89eb2ec011da7ef73", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4465,7 +4465,7 @@ "estimated_token_count_total": 2187 }, "hash": "sha256:56269d9ea47f5b4e92cd7d5a1e65ab06d181a9c380f90bb3ef285529b12299f7", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4497,7 +4497,7 @@ "estimated_token_count_total": 185 }, "hash": "sha256:900e54d04b11533efb15a34ebd76dc095a1873a926ecf2a5ce494cf0633c8be1", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4535,7 +4535,7 @@ "estimated_token_count_total": 428 }, "hash": "sha256:cfcc76bb24779c9b613f2c046b6f99a0f2529c25fd82287d804f6b945b936227", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4568,7 +4568,7 @@ "estimated_token_count_total": 245 }, "hash": "sha256:6d8e01281a5895fd2bc4438b24c170c72a496de0b838626a53e87685aea4aa25", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4625,7 +4625,7 @@ "estimated_token_count_total": 847 }, "hash": "sha256:a206dd86fc3d80aed22384000839ca0c9c75c69ad461abd9810d96c03cf6a3bd", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4697,7 +4697,7 @@ "estimated_token_count_total": 6312 }, "hash": "sha256:d132a135b7be0a571277eabd4f76781fe02de29d692c1e958560613ec25e891f", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4740,7 +4740,7 @@ "estimated_token_count_total": 834 }, "hash": "sha256:ef2cc8c69ca34dd35a012c361d5a7ce72dab888b4ef674f62310a1d3914c6554", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4772,7 +4772,7 @@ "estimated_token_count_total": 92 }, "hash": "sha256:0de8c1655a1524784010b6cec5fa522b2f764e32f18913f0d262283e0ec0779e", - "last_modified": 
"2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4834,7 +4834,7 @@ "estimated_token_count_total": 5209 }, "hash": "sha256:966ec1bcc014a454f6b837b503025d9fb89c30f6a65d0aaec82ea5ff976e53a9", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4871,7 +4871,7 @@ "estimated_token_count_total": 683 }, "hash": "sha256:8dc107b7323ca24d3a781ca37b89580aa6a77232a4f6109ca24a1048cd733123", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4930,7 +4930,7 @@ "estimated_token_count_total": 1700 }, "hash": "sha256:47328231d6ff4dc52cd93aaf1baf5d0bc2d9fc372f3d79339d87aafa0dabd1b8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -4957,7 +4957,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:c72d7d30a019fe1db8ab3993f91dfd4f1bdb4a932aaa685d3baaa0578091d5ce", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5025,7 +5025,7 @@ "estimated_token_count_total": 2453 }, "hash": "sha256:2c77cfb38bb2e466a8f56dabbb706fcd2e90cf1634fc9beb7f0ee95a75735653", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5052,7 +5052,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:cc49fdcc63a43247d80de2f309b9c7501d3054782746d80c003d95f3c43da90d", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5125,7 +5125,7 @@ "estimated_token_count_total": 2614 }, "hash": "sha256:4325cdd697814b8043db808da3dee86d3d9c6fc7dd523aae7fe8914d59d1b39c", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5162,7 +5162,7 @@ "estimated_token_count_total": 285 }, "hash": "sha256:340c8e81fdaca8a1b85a9addeed75cc617513b39658250693e2516c74b86aa6e", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5199,7 +5199,7 @@ "estimated_token_count_total": 180 }, "hash": "sha256:6e71534a424f6a08521b19c9b4cf668e495fb7c591463ffe63d1b03a8b17e435", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5257,7 +5257,7 @@ "estimated_token_count_total": 1430 }, "hash": "sha256:1284c42be692167e01bcc44e2e134ec20615402675fac26df246c00aa1588d80", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5325,7 +5325,7 @@ "estimated_token_count_total": 2018 }, "hash": "sha256:49866761ef638dd0683bb5558f5319b9568ff136295b3359580a6f478172c73f", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5373,7 +5373,7 @@ "estimated_token_count_total": 1001 }, "hash": "sha256:165d1f1606785801860e8af5ff4c2d9d393b3bc07d211fc2e6757c70445a8124", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5400,7 +5400,7 @@ "estimated_token_count_total": 12 }, "hash": 
"sha256:91de375b7f822ed56b5e6b4d609d0d36e806d3f77041b4e180b6679b10a3e1c8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5462,7 +5462,7 @@ "estimated_token_count_total": 1869 }, "hash": "sha256:978b4f2d2888ab26edeae998f09bead9ecd05460d229c63a8b2b2f4475438c69", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5549,7 +5549,7 @@ "estimated_token_count_total": 1870 }, "hash": "sha256:3b766e00e55a224201bc6744386a6dabc7da54ed9199b16abab3b94cff449eca", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5642,7 +5642,7 @@ "estimated_token_count_total": 9871 }, "hash": "sha256:0d7e04fd952cc9d5bd8cdbfd87cc4004c5f95e896a16bc7f89dfc4caeac8f371", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5705,7 +5705,7 @@ "estimated_token_count_total": 2661 }, "hash": "sha256:04e85c4cddb58252f8253d78a3924bb56952dac2a3e9a057704a91a0d1f21d75", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5742,7 +5742,7 @@ "estimated_token_count_total": 190 }, "hash": "sha256:1e474a9a1411a128abe943bdfabd8d5d27eaa7b52c5ba4c68379964fd27c6983", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5779,7 +5779,7 @@ "estimated_token_count_total": 168 }, "hash": "sha256:e26ea88a73f187ffbf9c7287f80b9e51604b92896b7c032b26b3d034d3c46b7d", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5811,7 +5811,7 @@ "estimated_token_count_total": 122 }, "hash": "sha256:3a3d8b02539e7aea22d26a8fb845b9e2d19ad3676220b521ab3f176128310698", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5838,7 +5838,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:20c667a337791538e3997f1f449bf69b248ccc4cc806e22615075f24fd3f0202", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5890,7 +5890,7 @@ "estimated_token_count_total": 1898 }, "hash": "sha256:45d815fe79b74b480570a333572fe5a0b0d2fdd33b4aabe2711d94094d6fee10", - "last_modified": "2025-10-24T21:11:54+00:00", + "last_modified": "2025-10-27T13:59:09+00:00", "token_estimator": "heuristic-v1" }, { @@ -5922,7 +5922,7 @@ "estimated_token_count_total": 2232 }, "hash": "sha256:5a8da69a5cea8bd598ee4d102b9abed5d1a29153802a567e22bb4ee720410b32", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -5994,7 +5994,7 @@ "estimated_token_count_total": 579 }, "hash": "sha256:4c33d0ec5026128b3bfdb1dfc1f4b29487404eaa8043071d536e8638356c6e1f", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6036,7 +6036,7 @@ "estimated_token_count_total": 557 }, "hash": "sha256:993e93b05c8fbdfc2f7510c61ac86bc4c2ff0f03e573695b2f260933c8b62f78", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ 
-6073,7 +6073,7 @@ "estimated_token_count_total": 280 }, "hash": "sha256:5bdc575ac798a971867a15651c2b4d5139bf0b1fe6854d1865deff280ae6d7f6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6094,7 +6094,7 @@ "estimated_token_count_total": 0 }, "hash": "sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6161,7 +6161,7 @@ "estimated_token_count_total": 1044 }, "hash": "sha256:d84a5af1a0237a911d25a68c077f508ebbce608f673ef4f9055e8e434daa96b9", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6228,7 +6228,7 @@ "estimated_token_count_total": 4229 }, "hash": "sha256:abd9f939f68b068a18567b875c9f7e11d102c54fc02ca0e6ee8041c539061ed0", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6285,7 +6285,7 @@ "estimated_token_count_total": 1286 }, "hash": "sha256:0b43b452e9d709cb324bf51fd88c2fed8e6249534a7c2b852e1bd36bcb9b981a", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6327,7 +6327,7 @@ "estimated_token_count_total": 462 }, "hash": "sha256:c6087224da8140a4a5d8bbc3cb9b9b389ffd57f679dca7eb67df9f64649f0eaf", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6389,7 +6389,7 @@ "estimated_token_count_total": 1827 }, "hash": "sha256:1090b02689df5f4c59bb83f9c81436718d06e46f3b615bc655fef3c7b6c9fb02", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6471,7 +6471,7 @@ "estimated_token_count_total": 2559 }, "hash": "sha256:0857a9e83aefc6d3f04e8cb320ab82d35211bbd73d2eb2614cf7b97f8e6d36b9", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6552,13 +6552,13 @@ } ], "stats": { - "chars": 14805, - "words": 2330, + "chars": 15774, + "words": 2425, "headings": 13, - "estimated_token_count_total": 3667 + "estimated_token_count_total": 3827 }, - "hash": "sha256:cf981bba31ab6a031539e0f85c079c20d5d3e202c05ec47564ee5e1563771852", - "last_modified": "2025-10-24T21:12:01+00:00", + "hash": "sha256:e2567b7d5377c87984622cf93afe4bd8cedf46b80597736cf53f26b5f31c5065", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6605,7 +6605,7 @@ "estimated_token_count_total": 625 }, "hash": "sha256:9ab570299106336e5d75923b876247e8eb4a71851a77e84d68e0335e9da5e0a8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6637,7 +6637,7 @@ "estimated_token_count_total": 404 }, "hash": "sha256:dad21b50f3732256f1367c2e79857f102635f0ed3015ed4013d7d0ca4d8b3a99", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6749,7 +6749,7 @@ "estimated_token_count_total": 5832 }, "hash": "sha256:a7b5239c3be0341ced8f28146e240ff6061fded2e71094bd586beeb024684a50", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": 
"heuristic-v1" }, { @@ -6801,7 +6801,7 @@ "estimated_token_count_total": 861 }, "hash": "sha256:97655248c65e816fdf3d85dab4ace7ca0c145c50f671c25c24627cfd7660c7a6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6858,7 +6858,7 @@ "estimated_token_count_total": 1167 }, "hash": "sha256:b2e8abce15fc9df106a5e972f28c64f606f9dd50ba3a256093eb53bdd5126224", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6890,7 +6890,7 @@ "estimated_token_count_total": 230 }, "hash": "sha256:f2cced19ba2b0b1ea46fd2f2892d328ac4797a1253d12cf479c64a447d7ce1ee", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6937,7 +6937,7 @@ "estimated_token_count_total": 1477 }, "hash": "sha256:76500d1d63f4205a84f0bc5b7f9aec945781127d41c32927280ac74bc14f0296", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -6969,7 +6969,7 @@ "estimated_token_count_total": 310 }, "hash": "sha256:7fb05e7b43cd5413b248605912ae0c6e24fd6c1ca199e64059c686c49d8cc456", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7036,7 +7036,7 @@ "estimated_token_count_total": 3409 }, "hash": "sha256:abe6bedab04f463ec07f554977b8d6355a5d2fad9bcda01cbe58568152295daa", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7088,7 +7088,7 @@ "estimated_token_count_total": 2617 }, "hash": "sha256:7d43408276d811c96b7b081a7b9f4d884893282a230b564c9eb3be2fc7857565", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7120,7 +7120,7 @@ "estimated_token_count_total": 343 }, "hash": "sha256:20f272dbbeb2b50a5e240b53ac45bed797ea58aa03e27c89194c941d66d8accf", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7152,7 +7152,7 @@ "estimated_token_count_total": 451 }, "hash": "sha256:2670bfa3c72e6b28c780cecdd4402691e616c2ab75e1d02a53834477ff1fcff8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7200,7 +7200,7 @@ "estimated_token_count_total": 1127 }, "hash": "sha256:a476a8f00a86860deb76ceb77de6eecc3d5ed17601841165a53176512ef7d1a6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7264,7 +7264,7 @@ "estimated_token_count_total": 1867 }, "hash": "sha256:4681fa2a9a5e44a52035ac9e58fb2f5e2abb667c36df94bfb1d4575293129134", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7291,7 +7291,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:3b9160b166d9b42b124f3b07eb26bdc5499fbbace6f951095009a5eee7fccbb6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7338,7 +7338,7 @@ "estimated_token_count_total": 619 }, "hash": "sha256:00be43ac8d666bbe15c5c2fa5a5085697d0bb5a6f341ebbb943a209f0be355df", - "last_modified": "2025-10-24T21:12:01+00:00", + 
"last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7400,7 +7400,7 @@ "estimated_token_count_total": 1440 }, "hash": "sha256:2d228c52844df8952520fafdd3e6f0e26bfd2f32b5ee60c6241cf7d38603643c", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7479,7 +7479,7 @@ "estimated_token_count_total": 2600 }, "hash": "sha256:603890033f956552f0d7b524d0186a8ee256ac76aabf608808dbc992f7d089ab", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7566,7 +7566,7 @@ "estimated_token_count_total": 2534 }, "hash": "sha256:191df9b098e17e9de4597c9f8ced8abbafdfabc7e0f5c0a94d767fc2c9d7742b", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7593,7 +7593,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:c29356358f095b0d413e4c6525146b3f1b0b900853aada2168e7e55cd8dd6641", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7725,7 +7725,7 @@ "estimated_token_count_total": 4105 }, "hash": "sha256:759ab6dea0ad03c3f627558ea186d9f32351fa559acde82931684efc2da59d46", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7777,7 +7777,7 @@ "estimated_token_count_total": 1218 }, "hash": "sha256:26c156146ef9743fc26c6499294ff14186f97edbc2a34f445d3366b72f7148ae", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7809,7 +7809,7 @@ "estimated_token_count_total": 424 }, "hash": "sha256:59ec351fbb8d3a392e90f4f5bf6b62f58b21d6d7a900c5e367e5d2e09ecb3aca", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7851,7 +7851,7 @@ "estimated_token_count_total": 1238 }, "hash": "sha256:6340c8a885d03adf633ae30438d8f21c195c276a48082657bb22dd53341f1cfb", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7919,7 +7919,7 @@ "estimated_token_count_total": 1657 }, "hash": "sha256:4f8573882bd0f9b0bbbb45efa313c2f3bb90e4f90d09a8276b4b99d56b4b01d5", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -7981,7 +7981,7 @@ "estimated_token_count_total": 876 }, "hash": "sha256:8239d1e8d8642cb7c10e9e5f971c99b999e9e4a87373b50bf4a691225c1e4702", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8008,7 +8008,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:6ef13c197dd1865fcc1a405d67486f1d053534d576bb32fe47a442fd2c11b6cd", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8040,7 +8040,7 @@ "estimated_token_count_total": 177 }, "hash": "sha256:ffda04c93c70ec7204be28b642fa6e51f6bf9436d4792ecd25136696683f0902", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8377,7 +8377,7 @@ "estimated_token_count_total": 5271 }, "hash": 
"sha256:f0e04286eacf23b182186f23e9854c0cd251545b8a8d561d2503f962dbfe32c0", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8419,7 +8419,7 @@ "estimated_token_count_total": 631 }, "hash": "sha256:baba9dd41091b792d09005d55d3df0bf65b35f42b40ebe63caf425a0978a22b0", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8487,7 +8487,7 @@ "estimated_token_count_total": 1611 }, "hash": "sha256:62beec261e72529f70e07a641177d489d2c8872f9c9d618cbadf1ac0fd881986", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8519,7 +8519,7 @@ "estimated_token_count_total": 233 }, "hash": "sha256:58fd5c8c092ee748c2979164f985a67071a6ccb88492e79cdad536363364c858", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -8611,13 +8611,13 @@ } ], "stats": { - "chars": 28474, - "words": 4035, + "chars": 29648, + "words": 4201, "headings": 15, - "estimated_token_count_total": 6243 + "estimated_token_count_total": 6521 }, - "hash": "sha256:af802947fa4194cbfec09160bb21ff61f013e6f43efa379a835f1f14de9ab8f1", - "last_modified": "2025-10-24T21:12:01+00:00", + "hash": "sha256:1f9ce923b3ce296571fe63837c0d3c3c791a339ef02db09ead6b2b92e9d1bfd5", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -8680,7 +8680,7 @@ "estimated_token_count_total": 1399 }, "hash": "sha256:bcad23a74d962cab72b54cdc090bf9ee0cd5ecf79f70fb642f154668c2743983", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -8778,7 +8778,7 @@ "estimated_token_count_total": 4464 }, "hash": "sha256:299597c39d0e4e4902be8e45b354fff78a862aa5799e4f16d16787a97a1e3da8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -8911,7 +8911,7 @@ "estimated_token_count_total": 4714 }, "hash": "sha256:e858bf6f7cf6af0525ffa8c8f5533e18b8ce0d6bbcd1b2acad71d31137a5f6b4", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -8938,7 +8938,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:235f33cdb64494815dbb3eb58ea98c69935098684e1b34b6d15356bc54b082ea", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9036,7 +9036,7 @@ "estimated_token_count_total": 3782 }, "hash": "sha256:eb4da21d561e9fd9333d97805318f0e263f54570120d3852ce7eba64da604cc2", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9119,7 +9119,7 @@ "estimated_token_count_total": 1797 }, "hash": "sha256:259dcef86aadc513675258b665cc3940db65af6eb32a5db85da6ac339966fa60", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9192,7 +9192,7 @@ "estimated_token_count_total": 3213 }, "hash": "sha256:e448294b6e52291ac0add5fa6533572814e6cd27af42bdaccc2000b86f52d775", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9250,7 +9250,7 @@ 
"estimated_token_count_total": 780 }, "hash": "sha256:077e7e5bfc9509cf09f455959a5da7a74b7af69836b3c4b334692f32e306ddf1", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9324,7 +9324,7 @@ "estimated_token_count_total": 1473 }, "hash": "sha256:695c624a1d7a3ed6fea0f4f5c19bb2100be986cec29ba58edb4598b9e9b98494", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9397,7 +9397,7 @@ "estimated_token_count_total": 914 }, "hash": "sha256:8122e21c149d0863cfe3b37fc5606bcdb91668e9d265f0f05451a61ff70e4e93", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9450,7 +9450,7 @@ "estimated_token_count_total": 1394 }, "hash": "sha256:217a79109aff1607594a0238fd91bfa812827620887c4f063c7e0a7a37f967d6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9477,7 +9477,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:1514316acba1e9bba82ae1c82b09481e9d03d286e6f5d93b66e5a85fd4be7bca", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9545,7 +9545,7 @@ "estimated_token_count_total": 1822 }, "hash": "sha256:db2b1806153242680043ced536f64fc8a2ed3c09adc1bec5aa287168b48e0994", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9608,7 +9608,7 @@ "estimated_token_count_total": 1178 }, "hash": "sha256:9a6b3fa6c005d75c25f0f683b7d8c3b65891454743b794c12b005f910b81609c", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9721,7 +9721,7 @@ "estimated_token_count_total": 5303 }, "hash": "sha256:6078ea5afa297470ab65e55d8da9001ab23d6191c591d708c5007338eb252eb8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9789,7 +9789,7 @@ "estimated_token_count_total": 891 }, "hash": "sha256:b5acdc9acf0e44836b8a4518155eba7d16cc3b103c557a00970ffb1c44c3e9f6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9842,7 +9842,7 @@ "estimated_token_count_total": 2553 }, "hash": "sha256:40e799ce83609d6935f058e92cbb2f4c927b31ffcc6d6d7d257423b8388453e6", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9895,7 +9895,7 @@ "estimated_token_count_total": 994 }, "hash": "sha256:6992c9a2d1b315b64d9782880105cf2d436750249a84577aceb95cc213863009", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9927,7 +9927,7 @@ "estimated_token_count_total": 148 }, "hash": "sha256:e8dac01e89b7aac4b887e962e91084c253f5ea25c1abc3a56355390d0c3201c8", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -9954,7 +9954,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:49be4b4b5289572086eaaaf9ccff3bee7879b534188331c9a8052b3fe5aa4933", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": 
"2025-10-27T13:59:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -10021,7 +10021,7 @@ "estimated_token_count_total": 2251 }, "hash": "sha256:98f8303886011fb13fe8e7a32a8a6150f68703ec7c2a863a21050a35aebf2f36", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10053,7 +10053,7 @@ "estimated_token_count_total": 196 }, "hash": "sha256:fb892e81a2add1b64214c6cabe837d5068ebe54a7fb65e9149edbfb68f578a53", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10165,7 +10165,7 @@ "estimated_token_count_total": 4844 }, "hash": "sha256:96acff10be56dea76acdb5c915c1dde0eb15eb12eb95e7871eef56bab6cda273", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10237,7 +10237,7 @@ "estimated_token_count_total": 2375 }, "hash": "sha256:61bc251929352f2299ca1d413d05aa9c3672b914575a285d73c7ba53dbd75bff", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10289,7 +10289,7 @@ "estimated_token_count_total": 1461 }, "hash": "sha256:370ed10155cee84889a6d230d0bc3476597448f88a2a271ab87ef893a3268c18", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10326,7 +10326,7 @@ "estimated_token_count_total": 205 }, "hash": "sha256:c1f893d4086b0bf5d6b3c50d0c6cffe27d4deddf1240b250df6432eddcec969c", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10398,7 +10398,7 @@ "estimated_token_count_total": 7755 }, "hash": "sha256:086a87823ab67ceac102358030e316583cd733c0ec326316e7f29061fe7f6934", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10460,7 +10460,7 @@ "estimated_token_count_total": 2770 }, "hash": "sha256:581c8ac75aed22373939de6ad8396ee6e2840fbbaf495e81daf115d444a53017", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10487,7 +10487,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:09264d36777b5acb8cb1f3811e7f2bbe58641e0aac3afd74e426a2533dc9fa61", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10529,7 +10529,7 @@ "estimated_token_count_total": 343 }, "hash": "sha256:dcb1210e19815f3659ea283f8fc04dd5b85bc440a54e31e889f5ec5ae9229b78", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10611,7 +10611,7 @@ "estimated_token_count_total": 34507 }, "hash": "sha256:21ec1fdbd5e12b831a0c11e24ad4a4917e9d3dc9485d218c2ddc4372fc76e24b", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10643,7 +10643,7 @@ "estimated_token_count_total": 126 }, "hash": "sha256:b9c07713604ff9658363bf5e7a0726ecb7781418826ff65abffddffc8083d33f", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -10690,13 +10690,13 @@ } ], "stats": { - "chars": 10973, - "words": 1287, + "chars": 13025, + "words": 1519, 
"headings": 6, - "estimated_token_count_total": 2505 + "estimated_token_count_total": 3078 }, - "hash": "sha256:e074f9f36f0699a69f1613ff0bfa5d6e01619bdf653ea98196209947d10e205e", - "last_modified": "2025-10-24T21:12:03+00:00", + "hash": "sha256:a0b5f85630885f13f97381cc06d58139d4692528ca19ff53f599822a84b0f710", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -10789,7 +10789,7 @@ "estimated_token_count_total": 5338 }, "hash": "sha256:b3530f5fc5c9e916181dbc259a7fbae9c60100cb0450fc6d47bbb0d140afa075", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -10861,7 +10861,7 @@ "estimated_token_count_total": 4362 }, "hash": "sha256:a66380d109832bbcc12aee24399a728c8213fa2cf6912b7fe563034fc4775593", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -10918,7 +10918,7 @@ "estimated_token_count_total": 2138 }, "hash": "sha256:ff2c267284959711782c0d6ecb4b439c3a6cc31f763d5e1ff2cc3b1f6efb62b2", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -10970,7 +10970,7 @@ "estimated_token_count_total": 2929 }, "hash": "sha256:5d455d265430e71b2a8a8f5d4ba64caab329b5f6e959375ca7c5f8580ed00003", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11062,7 +11062,7 @@ "estimated_token_count_total": 4791 }, "hash": "sha256:24d101e192069fd4a1a829a7197c8878af0eabeca8ef5c3100ddbe46ec116488", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11144,7 +11144,7 @@ "estimated_token_count_total": 2559 }, "hash": "sha256:b446e5283fb0399f16b23a269bbbe8cca7ed08274fae7611e9e3a7aa921ae662", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11207,7 +11207,7 @@ "estimated_token_count_total": 3257 }, "hash": "sha256:5da581453e1e0b0b9659c2c74fe2c0c3e8060fa04445dd389644626e4232916d", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11234,7 +11234,7 @@ "estimated_token_count_total": 42 }, "hash": "sha256:06acc146698c1d3224544987d7ee52da498e3179228f98a494e385c5786a3a2c", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11271,7 +11271,7 @@ "estimated_token_count_total": 107 }, "hash": "sha256:fdd391227992c966de25b9240f5492135a9993859ec42b77952c1aa3d2e39ed9", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -11338,7 +11338,7 @@ "estimated_token_count_total": 4242 }, "hash": "sha256:2f11054e0d31c003ebae5d990b559bd56741d190ca409f6ad060216245fa2d17", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11395,7 +11395,7 @@ "estimated_token_count_total": 2263 }, "hash": "sha256:a6a535f4f5e145d3e2a7518739f752ee3ed37b7745483f414e21c97792331d18", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11443,7 +11443,7 @@ "estimated_token_count_total": 1571 }, 
"hash": "sha256:3ad540d8ad636304705cccb08bc1fdf21fe2fc7dc0f99bd509b23ae96d20e0ba", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11535,7 +11535,7 @@ "estimated_token_count_total": 2140 }, "hash": "sha256:388c988338ed84589c546bb1606d08641fb931dae307d3df92aeccd2e4986080", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11577,7 +11577,7 @@ "estimated_token_count_total": 397 }, "hash": "sha256:9044f2d9bca77f3e0062a47a52c592c756384850b051cb5be4f9373cff79440d", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11614,7 +11614,7 @@ "estimated_token_count_total": 229 }, "hash": "sha256:9d2299b006c2393409ba46f729c6970f9e4d485d5164be6465f5f390969cd881", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11683,7 +11683,7 @@ "estimated_token_count_total": 2737 }, "hash": "sha256:b3bae6a7538228ab76099223b4712df93d53727f2a559faf60516a3cd0165178", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11752,7 +11752,7 @@ "estimated_token_count_total": 2580 }, "hash": "sha256:b6570ad1b32bb07cf0ae33442e6189e1837bfa28f687e589ca189851aa292091", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11784,7 +11784,7 @@ "estimated_token_count_total": 80 }, "hash": "sha256:1dfbb8c3cfa27f92e982b4ce705415e117c50eb38f641691129863b474741da7", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11821,7 +11821,7 @@ "estimated_token_count_total": 255 }, "hash": "sha256:189ac2cf8bfd44fc76a6f45f504effe4ea11653c5d86c7fa825b918fdbd4f564", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -11879,7 +11879,7 @@ "estimated_token_count_total": 2670 }, "hash": "sha256:07629376480e74afc7fe4d91df539b6ab22453df0f8143df11cc51ef9a78f736", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11906,7 +11906,7 @@ "estimated_token_count_total": 12 }, "hash": "sha256:94dbafb2d78b87d5f0f0c75de002501b8210ac8d66072bc07989f685837cbac5", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:25+00:00", "token_estimator": "heuristic-v1" }, { @@ -11960,7 +11960,7 @@ "estimated_token_count_total": 2056 }, "hash": "sha256:cf9197d6909dd8865e8838cad95e3692fefaecc3d2f4773b26809a02051d620f", - "last_modified": "2025-10-24T21:12:03+00:00", + "last_modified": "2025-10-27T14:00:26+00:00", "token_estimator": "heuristic-v1" }, { @@ -12014,7 +12014,7 @@ "estimated_token_count_total": 2220 }, "hash": "sha256:aa6371024bb78c3eeedb6820a37859670046fd0e4f756ad417b20c39fb2983b9", - "last_modified": "2025-10-24T21:12:04+00:00", + "last_modified": "2025-10-27T14:00:27+00:00", "token_estimator": "heuristic-v1" }, { @@ -12061,13 +12061,13 @@ } ], "stats": { - "chars": 7480, - "words": 1056, + "chars": 8670, + "words": 1178, "headings": 6, - "estimated_token_count_total": 1560 + "estimated_token_count_total": 1760 }, - "hash": 
"sha256:31376f2d82192ea9199296355eb227b24ffe3c2dfefe38fdaa17177f269f9684", - "last_modified": "2025-10-24T21:12:04+00:00", + "hash": "sha256:4f3e2e50a595b0f93078ce0f63185a6f367ae341ae8afac10709c061c54b3307", + "last_modified": "2025-10-27T14:00:28+00:00", "token_estimator": "heuristic-v1" }, { @@ -12134,13 +12134,13 @@ } ], "stats": { - "chars": 7495, - "words": 1119, + "chars": 21978, + "words": 2569, "headings": 10, - "estimated_token_count_total": 1759 + "estimated_token_count_total": 5126 }, - "hash": "sha256:a443b1b89ce287cbaf3cdb5da04e843b8809eb46fb869cbaa17ce1bd92f6b7ab", - "last_modified": "2025-10-24T21:12:08+00:00", + "hash": "sha256:12cbf7c21f969c771ab046ddf836a1967f80e89e8e379d30e4ecdced43d3b160", + "last_modified": "2025-10-27T14:00:37+00:00", "token_estimator": "heuristic-v1" }, { @@ -12217,13 +12217,13 @@ } ], "stats": { - "chars": 16479, - "words": 2032, + "chars": 27950, + "words": 3205, "headings": 12, - "estimated_token_count_total": 3550 + "estimated_token_count_total": 6212 }, - "hash": "sha256:c3ab578ee1ceaf34852be04c882fc323cc422a894894ba71687f32863b701010", - "last_modified": "2025-10-24T21:12:13+00:00", + "hash": "sha256:1c637fe7a53d42add1a2ee09091d79f2086f2fe3091674e58b40574ae49bb2ec", + "last_modified": "2025-10-27T14:00:45+00:00", "token_estimator": "heuristic-v1" }, { @@ -12280,13 +12280,13 @@ } ], "stats": { - "chars": 17542, - "words": 2148, + "chars": 18691, + "words": 2236, "headings": 8, - "estimated_token_count_total": 3873 + "estimated_token_count_total": 4130 }, - "hash": "sha256:29bcad2dfbad9a6407097d5df7268c6e4beb4e6314d6bf832b295d2be5203136", - "last_modified": "2025-10-24T21:12:15+00:00", + "hash": "sha256:576067f5cd3a315dea7998f9bb729c93646e38b586f73402d05be6d75e38c80f", + "last_modified": "2025-10-27T14:00:52+00:00", "token_estimator": "heuristic-v1" }, { @@ -12318,7 +12318,7 @@ "estimated_token_count_total": 77 }, "hash": "sha256:8d8fc5f794d4c793586cd3d412627f5e2fe76f182c75c3687abcf33deed5d65e", - "last_modified": "2025-10-24T21:12:13+00:00", + "last_modified": "2025-10-27T14:00:45+00:00", "token_estimator": "heuristic-v1" }, { @@ -12355,7 +12355,7 @@ "estimated_token_count_total": 130 }, "hash": "sha256:66bc34a12c50539dde2ffc69fe66891f73d3e1a2da5833ada15e26744ff32209", - "last_modified": "2025-10-24T21:12:04+00:00", + "last_modified": "2025-10-27T14:00:27+00:00", "token_estimator": "heuristic-v1" }, { @@ -12397,7 +12397,7 @@ "estimated_token_count_total": 579 }, "hash": "sha256:eb544bbab067f4b3516190c6fe2df01049970bebb409095af4d3fd9b8bb771fe", - "last_modified": "2025-10-24T21:12:01+00:00", + "last_modified": "2025-10-27T13:59:28+00:00", "token_estimator": "heuristic-v1" } ] \ No newline at end of file diff --git a/llms-full.jsonl b/llms-full.jsonl index 0a3d748ba..cbca4f9c0 100644 --- a/llms-full.jsonl +++ b/llms-full.jsonl @@ -11,11 +11,11 @@ {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 3, "depth": 2, "title": "Primary Extrinsics of the XCM Pallet", "anchor": "primary-extrinsics-of-the-xcm-pallet", "start_char": 3620, "end_char": 3820, "estimated_token_count": 35, "token_estimator": "heuristic-v1", "text": "## Primary Extrinsics of the XCM Pallet\n\nThis page will highlight the two **Primary Primitive Calls** responsible for sending and executing XCVM programs as dispatchable functions within the pallet."} {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 4, "depth": 3, "title": "Execute", "anchor": "execute", "start_char": 3820, "end_char": 
5071, "estimated_token_count": 298, "token_estimator": "heuristic-v1", "text": "### Execute\n\nThe [`execute`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute){target=\\_blank} call directly interacts with the XCM executor, allowing for the execution of XCM messages originating from a locally signed origin. The executor validates the message, ensuring it complies with any configured barriers or filters before executing.\n\nOnce validated, the message is executed locally, and an event is emitted to indicate the result—whether the message was fully executed or only partially completed. Execution is capped by a maximum weight ([`max_weight`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.execute.field.max_weight){target=\\_blank}); if the required weight exceeds this limit, the message will not be executed.\n\n```rust\npub fn execute(\n message: Box::RuntimeCall>>,\n max_weight: Weight,\n)\n```\n\nFor further details about the `execute` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\\_blank}.\n\n!!!warning\n Partial execution of messages may occur depending on the constraints or barriers applied."} {"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 5, "depth": 3, "title": "Send", "anchor": "send", "start_char": 5071, "end_char": 6081, "estimated_token_count": 254, "token_estimator": "heuristic-v1", "text": "### Send\n\nThe [`send`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/enum.Call.html#variant.send){target=\\_blank} call enables XCM messages to be sent to a specified destination. This could be a parachain, smart contract, or any external system governed by consensus. Unlike the execute call, the message is not executed locally but is transported to the destination chain for processing.\n\nThe destination is defined using a [Location](https://paritytech.github.io/polkadot-sdk/master/xcm_docs/glossary/index.html#location){target=\\_blank}, which describes the target chain or system. This ensures precise delivery through the configured XCM transport mechanism.\n\n```rust\npub fn send(\n dest: Box,\n message: Box::RuntimeCall>>,\n)\n```\n\nFor further information about the `send` extrinsic, see the [`pallet-xcm` documentation](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/struct.Pallet.html){target=\\_blank}."} -{"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 6, "depth": 2, "title": "XCM Router", "anchor": "xcm-router", "start_char": 6081, "end_char": 7032, "estimated_token_count": 225, "token_estimator": "heuristic-v1", "text": "## XCM Router\n\nThe [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/trait.Config.html#associatedtype.XcmRouter){target=\\_blank} is a critical component the XCM pallet requires to facilitate sending XCM messages. 
It defines where messages can be sent and determines the appropriate XCM transport protocol for the operation.\n\nFor instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\\_blank} from the relay chain to parachains, ensuring secure and controlled communication.\n\n```rust\n\n```\n\nFor more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\\_blank} page."} +{"page_id": "develop-interoperability-send-messages", "page_title": "Send XCM Messages", "index": 6, "depth": 2, "title": "XCM Router", "anchor": "xcm-router", "start_char": 6081, "end_char": 7146, "estimated_token_count": 240, "token_estimator": "heuristic-v1", "text": "## XCM Router\n\nThe [`XcmRouter`](https://paritytech.github.io/polkadot-sdk/master/pallet_xcm/pallet/trait.Config.html#associatedtype.XcmRouter){target=\\_blank} is a critical component the XCM pallet requires to facilitate sending XCM messages. It defines where messages can be sent and determines the appropriate XCM transport protocol for the operation.\n\nFor instance, the Kusama network employs the [`ChildParachainRouter`](https://paritytech.github.io/polkadot-sdk/master/polkadot_runtime_common/xcm_sender/struct.ChildParachainRouter.html){target=\\_blank}, which restricts routing to [Downward Message Passing (DMP)](https://wiki.polkadot.com/learn/learn-xcm-transport/#dmp-downward-message-passing){target=\\_blank} from the relay chain to parachains, ensuring secure and controlled communication.\n\n```rust\npub type PriceForChildParachainDelivery =\n\tExponentialPrice;\n```\n\nFor more details about XCM transport protocols, see the [XCM Channels](/develop/interoperability/xcm-channels/){target=\\_blank} page."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 25, "end_char": 875, "estimated_token_count": 162, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCross-Consensus Messaging (XCM) is a core feature of the Polkadot ecosystem, enabling communication between parachains, relay chains, and system chains. To ensure the reliability of XCM-powered blockchains, thorough testing and debugging are essential before production deployment.\n\nThis guide covers the XCM Emulator, a tool designed to facilitate onboarding and testing for developers. 
Use the emulator if:\n\n- A live runtime is not yet available.\n- Extensive configuration adjustments are needed, as emulated chains differ from live networks.\n- Rust-based tests are preferred for automation and integration.\n\nFor scenarios where real blockchain state is required, [Chopsticks](/tutorials/polkadot-sdk/testing/fork-live-chains/#xcm-testing){target=\\_blank} allows testing with any client compatible with Polkadot SDK-based chains."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 1, "depth": 2, "title": "XCM Emulator", "anchor": "xcm-emulator", "start_char": 875, "end_char": 2026, "estimated_token_count": 225, "token_estimator": "heuristic-v1", "text": "## XCM Emulator\n\nSetting up a live network with multiple interconnected parachains for XCM testing can be complex and resource-intensive. \n\nThe [`xcm-emulator`](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/xcm/xcm-emulator){target=\\_blank} is a tool designed to simulate the execution of XCM programs using predefined runtime configurations. These configurations include those utilized by live networks like Kusama, Polkadot, and Asset Hub.\n\nThis tool enables testing of cross-chain message passing, providing a way to verify outcomes, weights, and side effects efficiently. It achieves this by utilizing mocked runtimes for both the relay chain and connected parachains, enabling developers to focus on message logic and configuration without needing a live network.\n\nThe `xcm-emulator` relies on transport layer pallets. However, the messages do not leverage the same messaging infrastructure as live networks since the transport mechanism is mocked. Additionally, consensus-related events are not covered, such as disputes and staking events. Parachains should use end-to-end (E2E) tests to validate these events."} {"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 2, "depth": 3, "title": "Advantages and Limitations", "anchor": "advantages-and-limitations", "start_char": 2026, "end_char": 3130, "estimated_token_count": 219, "token_estimator": "heuristic-v1", "text": "### Advantages and Limitations\n\nThe XCM Emulator provides both advantages and limitations when testing cross-chain communication in simulated environments.\n\n- **Advantages**:\n - **Interactive debugging**: Offers tracing capabilities similar to EVM, enabling detailed analysis of issues.\n - **Runtime composability**: Facilitates testing and integration of multiple runtime components.\n - **Immediate feedback**: Supports Test-Driven Development (TDD) by providing rapid test results.\n - **Seamless integration testing**: Simplifies the process of testing new runtime versions in an isolated environment.\n\n- **Limitations**:\n - **Simplified emulation**: Always assumes message delivery, which may not mimic real-world network behavior.\n - **Dependency challenges**: Requires careful management of dependency versions and patching. 
Refer to the [Cargo dependency documentation](https://doc.rust-lang.org/cargo/reference/overriding-dependencies.html){target=\\_blank}.\n - **Compilation overhead**: Testing environments can be resource-intensive, requiring frequent compilation updates."} -{"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 3, "depth": 3, "title": "How Does It Work?", "anchor": "how-does-it-work", "start_char": 3130, "end_char": 5974, "estimated_token_count": 665, "token_estimator": "heuristic-v1", "text": "### How Does It Work?\n\nThe `xcm-emulator` provides macros for defining a mocked testing environment. Check all the existing macros and functionality in the [XCM Emulator source code](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs){target=\\_blank}. The most important macros are:\n\n- **[`decl_test_relay_chains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L361){target=\\_blank}**: Defines runtime and configuration for the relay chains. Example:\n\n ```rust\n decl_test_relay_chains! {\n \t#[api_version(13)]\n \tpub struct Westend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = (),\n \t\truntime = westend_runtime,\n \t\tcore = {\n \t\t\tSovereignAccountOf: westend_runtime::xcm_config::LocationConverter,\n \t\t},\n \t\tpallets = {\n \t\t\tXcmPallet: westend_runtime::XcmPallet,\n \t\t\tSudo: westend_runtime::Sudo,\n \t\t\tBalances: westend_runtime::Balances,\n \t\t\tTreasury: westend_runtime::Treasury,\n \t\t\tAssetRate: westend_runtime::AssetRate,\n \t\t\tHrmp: westend_runtime::Hrmp,\n \t\t\tIdentity: westend_runtime::Identity,\n \t\t\tIdentityMigrator: westend_runtime::IdentityMigrator,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\\_blank}**: Defines runtime and configuration for parachains. Example:\n\n ```rust\n \n ```\n\n- **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example:\n\n ```rust\n \n ```\n\n- **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example:\n\n ```rust\n \n ```\n\nBy leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\\_blank} article. 
\n\nThis framework enables thorough testing of runtime and cross-chain interactions, enabling developers to effectively design, test, and optimize cross-chain functionality.\n\nTo see a complete example of implementing and executing tests, refer to the [integration tests](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/parachains/integration-tests/emulated){target=\\_blank} in the Polkadot SDK repository."} +{"page_id": "develop-interoperability-test-and-debug", "page_title": "Testing and Debugging", "index": 3, "depth": 3, "title": "How Does It Work?", "anchor": "how-does-it-work", "start_char": 3130, "end_char": 7729, "estimated_token_count": 881, "token_estimator": "heuristic-v1", "text": "### How Does It Work?\n\nThe `xcm-emulator` provides macros for defining a mocked testing environment. Check all the existing macros and functionality in the [XCM Emulator source code](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs){target=\\_blank}. The most important macros are:\n\n- **[`decl_test_relay_chains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L361){target=\\_blank}**: Defines runtime and configuration for the relay chains. Example:\n\n ```rust\n decl_test_relay_chains! {\n \t#[api_version(13)]\n \tpub struct Westend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = (),\n \t\truntime = westend_runtime,\n \t\tcore = {\n \t\t\tSovereignAccountOf: westend_runtime::xcm_config::LocationConverter,\n \t\t},\n \t\tpallets = {\n \t\t\tXcmPallet: westend_runtime::XcmPallet,\n \t\t\tSudo: westend_runtime::Sudo,\n \t\t\tBalances: westend_runtime::Balances,\n \t\t\tTreasury: westend_runtime::Treasury,\n \t\t\tAssetRate: westend_runtime::AssetRate,\n \t\t\tHrmp: westend_runtime::Hrmp,\n \t\t\tIdentity: westend_runtime::Identity,\n \t\t\tIdentityMigrator: westend_runtime::IdentityMigrator,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_parachains`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L596){target=\\_blank}**: Defines runtime and configuration for parachains. Example:\n\n ```rust\n decl_test_parachains! 
{\n \tpub struct AssetHubWestend {\n \t\tgenesis = genesis::genesis(),\n \t\ton_init = {\n \t\t\tasset_hub_westend_runtime::AuraExt::on_initialize(1);\n \t\t},\n \t\truntime = asset_hub_westend_runtime,\n \t\tcore = {\n \t\t\tXcmpMessageHandler: asset_hub_westend_runtime::XcmpQueue,\n \t\t\tLocationToAccountId: asset_hub_westend_runtime::xcm_config::LocationToAccountId,\n \t\t\tParachainInfo: asset_hub_westend_runtime::ParachainInfo,\n \t\t\tMessageOrigin: cumulus_primitives_core::AggregateMessageOrigin,\n \t\t\tDigestProvider: (),\n \t\t},\n \t\tpallets = {\n \t\t\tPolkadotXcm: asset_hub_westend_runtime::PolkadotXcm,\n \t\t\tBalances: asset_hub_westend_runtime::Balances,\n \t\t\tAssets: asset_hub_westend_runtime::Assets,\n \t\t\tForeignAssets: asset_hub_westend_runtime::ForeignAssets,\n \t\t\tPoolAssets: asset_hub_westend_runtime::PoolAssets,\n \t\t\tAssetConversion: asset_hub_westend_runtime::AssetConversion,\n \t\t\tSnowbridgeSystemFrontend: asset_hub_westend_runtime::SnowbridgeSystemFrontend,\n \t\t\tRevive: asset_hub_westend_runtime::Revive,\n \t\t}\n \t},\n }\n ```\n\n- **[`decl_test_bridges`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L1221){target=\\_blank}**: Creates bridges between chains, specifying the source, target, and message handler. Example:\n\n ```rust\n decl_test_bridges! {\n \tpub struct RococoWestendMockBridge {\n \t\tsource = BridgeHubRococoPara,\n \t\ttarget = BridgeHubWestendPara,\n \t\thandler = RococoWestendMessageHandler\n \t},\n \tpub struct WestendRococoMockBridge {\n \t\tsource = BridgeHubWestendPara,\n \t\ttarget = BridgeHubRococoPara,\n \t\thandler = WestendRococoMessageHandler\n \t}\n }\n ```\n\n- **[`decl_test_networks`](https://github.com/paritytech/polkadot-sdk/blob/polkadot-stable2506-2/cumulus/xcm/xcm-emulator/src/lib.rs#L958){target=\\_blank}**: Defines a testing network with relay chains, parachains, and bridges, implementing message transport and processing logic. Example:\n\n ```rust\n decl_test_networks! {\n \tpub struct WestendMockNet {\n \t\trelay_chain = Westend,\n \t\tparachains = vec![\n \t\t\tAssetHubWestend,\n \t\t\tBridgeHubWestend,\n \t\t\tCollectivesWestend,\n \t\t\tCoretimeWestend,\n \t\t\tPeopleWestend,\n \t\t\tPenpalA,\n \t\t\tPenpalB,\n \t\t],\n \t\tbridge = ()\n \t},\n }\n ```\n\nBy leveraging these macros, developers can customize their testing networks by defining relay chains and parachains tailored to their needs. For guidance on implementing a mock runtime for a Polkadot SDK-based chain, refer to the [Pallet Testing](/develop/parachains/testing/pallet-testing/){target=\\_blank} article. 
\n\nThis framework enables thorough testing of runtime and cross-chain interactions, enabling developers to effectively design, test, and optimize cross-chain functionality.\n\nTo see a complete example of implementing and executing tests, refer to the [integration tests](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/cumulus/parachains/integration-tests/emulated){target=\\_blank} in the Polkadot SDK repository."} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 0, "depth": 2, "title": "The problem before v5", "anchor": "the-problem-before-v5", "start_char": 446, "end_char": 932, "estimated_token_count": 102, "token_estimator": "heuristic-v1", "text": "## The problem before v5\n\nWhen XCM execution failed and assets became trapped:\n\n- **Governance dependency**: Most trapped asset recovery requires governance proposals.\n- **Complex procedures**: Manual intervention through referendum processes.\n- **Long delays**: Recovery could take weeks or months through governance.\n- **Risk of loss**: Assets could remain permanently trapped if governance didn't act.\n- **High barriers**: Small amounts often weren't worth the governance overhead."} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 1, "depth": 2, "title": "The V5 Solution: `AssetClaimer` Hint", "anchor": "the-v5-solution-assetclaimer-hint", "start_char": 932, "end_char": 1343, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## The V5 Solution: `AssetClaimer` Hint\n\nThe new [`AssetClaimer`](https://paritytech.github.io/polkadot-sdk/master/staging_xcm/v5/enum.Hint.html#variant.AssetClaimer){target=\\_blank} hint allows XCM programs to preemptively designate who can claim trapped assets:\n\n```typescript\n// Set asset claimer before risky operations\nXcmV5Instruction.SetHints({ \n hints: [Enum('AssetClaimer', claimerLocation)] \n})\n```"} {"page_id": "develop-interoperability-versions-v5-asset-claimer", "page_title": "Asset claimer", "index": 2, "depth": 2, "title": "How it Improves the Situation", "anchor": "how-it-improves-the-situation", "start_char": 1343, "end_char": 2490, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## How it Improves the Situation\n\nThe `AssetClaimer` hint transforms the recovery process by allowing proactive designation of claimers, eliminating the need for governance intervention in most cases.\n\n- **Before XCM V5:**\n\n ```typescript\n // If this XCM fails, assets become trapped\n const riskyXcm = [\n XcmInstruction.WithdrawAsset([assets]),\n XcmInstruction.BuyExecution({ fees, weight_limit }),\n XcmInstruction.Transact({ /* risky call */ }),\n XcmInstruction.DepositAsset({ assets, beneficiary })\n ]\n\n // Recovery required governance intervention\n ```\n\n- **With XCM V5:**\n\n ```typescript\n // Proactive asset claimer setup\n const saferXcm = [\n // Anyone can now claim if execution fails\n XcmV5Instruction.SetHints({ \n hints: [Enum('AssetClaimer', claimerLocation)] \n }),\n XcmV5Instruction.WithdrawAsset([assets]),\n XcmV5Instruction.PayFees({ asset }),\n XcmV5Instruction.Transact({ /* risky call */ }),\n XcmV5Instruction.DepositAsset({ assets, beneficiary })\n ]\n\n // Recovery can be done immediately by the claimer\n ```"} @@ -107,13 +107,13 @@ {"page_id": "develop-interoperability-xcm-guides", "page_title": "XCM Guides", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", 
"start_char": 516, "end_char": 1619, "estimated_token_count": 316, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 20, "end_char": 932, "estimated_token_count": 159, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nRuntime APIs allow node-side code to extract information from the runtime state. While simple storage access retrieves stored values directly, runtime APIs enable arbitrary computation, making them a powerful tool for interacting with the chain's state.\n\nUnlike direct storage access, runtime APIs can derive values from storage based on arguments or perform computations that don't require storage access. For example, a runtime API might expose a formula for fee calculation, using only the provided arguments as inputs rather than fetching data from storage.\n\nIn general, runtime APIs are used for:\n\n- Accessing a storage item.\n- Retrieving a bundle of related storage items.\n- Deriving a value from storage based on arguments.\n- Exposing formulas for complex computational calculations.\n\nThis section will teach you about specific runtime APIs that support XCM processing and manipulation."} {"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 1, "depth": 2, "title": "Dry Run API", "anchor": "dry-run-api", "start_char": 932, "end_char": 1492, "estimated_token_count": 140, "token_estimator": "heuristic-v1", "text": "## Dry Run API\n\nThe [Dry-run API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/dry_run/trait.DryRunApi.html){target=\\_blank}, given an extrinsic, or an XCM program, returns its effects:\n\n- Execution result\n- Local XCM (in the case of an extrinsic)\n- Forwarded XCMs\n- List of events\n\nThis API can be used independently for dry-running, double-checking, or testing. However, it mainly shines when used with the [Xcm Payment API](#xcm-payment-api), given that it only estimates fees if you know the specific XCM you want to execute or send."} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 2, "depth": 3, "title": "Dry Run Call", "anchor": "dry-run-call", "start_char": 1492, "end_char": 10303, "estimated_token_count": 1620, "token_estimator": "heuristic-v1", "text": "### Dry Run Call\n\nThis API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains.\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `origin` ++\"OriginCaller\"++ ++\"required\"++\n \n The origin used for executing the transaction.\n\n ---\n\n `call` ++\"Call\"++ ++\"required\"++\n\n The extrinsic to be executed.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `CallDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `local_xcm` ++\"Option>\"++\n\n The local XCM that was attempted to be executed, if any.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? 
child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#reserve-asset-transfer){target=\\_blank} mechanism. Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { paseo } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n PolkadotRuntimeOriginCaller,\n XcmVersionedLocation,\n XcmVersionedAssets,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n } from '@polkadot-api/descriptors';\n import { DispatchRawOrigin } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to the Paseo relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const popParaID = 4001;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin caller\n // This is a regular signed account owned by a user\n let origin = PolkadotRuntimeOriginCaller.system(\n DispatchRawOrigin.Signed(userAddress),\n );\n\n // Define a transaction to transfer assets from Polkadot to Pop Network using a Reserve Transfer\n const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({\n dest: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain\n ),\n }),\n beneficiary: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n // Beneficiary address on Pop Network\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n }),\n assets: XcmVersionedAssets.V3([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 0,\n interior: XcmV3Junctions.Here(), // Native asset from the sender. 
In this case PAS\n }),\n fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer\n },\n ]),\n fee_asset_item: 0, // Asset used to pay transaction fees\n weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction\n });\n\n // Execute the dry run call to simulate the transaction\n const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call(\n origin,\n tx.decodedCall,\n );\n\n // Extract the data from the dry run result\n const {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n forwarded_xcms: forwardedXcms,\n } = dryRunResult.value;\n\n // Extract the XCM generated by this call\n const xcmsToPop = forwardedXcms.find(\n ([location, _]) =>\n location.type === 'V4' &&\n location.value.parents === 0 &&\n location.value.interior.type === 'X1' &&\n location.value.interior.value.type === 'Parachain' &&\n location.value.interior.value.value === popParaID, // Pop network's ParaID\n );\n const destination = xcmsToPop[0];\n const remoteXcm = xcmsToPop[1][0];\n\n // Print the results\n const resultObject = {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n destination: destination,\n remote_xcm: remoteXcm,\n };\n\n console.dir(resultObject, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            success: true,\n            value: {\n              actual_weight: undefined,\n              pays_fee: { type: 'Yes', value: undefined }\n            }\n          },\n          emitted_events: [\n                ...\n          ],\n          local_xcm: undefined,\n          destination: {\n            type: 'V4',\n            value: {\n              parents: 0,\n              interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }\n            }\n          },\n          remote_xcm: {\n            type: 'V3',\n            value: [\n              {\n                type: 'ReserveAssetDeposited',\n                value: [\n                  {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  }\n                ]\n              },\n              { type: 'ClearOrigin', value: undefined },\n              {\n                type: 'BuyExecution',\n                value: {\n                  fees: {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  },\n                  weight_limit: { type: 'Unlimited', value: undefined }\n                }\n              },\n              {\n                type: 'DepositAsset',\n                value: {\n                  assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },\n                  beneficiary: {\n                    parents: 0,\n                    interior: {\n                      type: 'X1',\n                      value: {\n                        type: 'AccountId32',\n                        value: {\n                          network: undefined,\n                          id: FixedSizeBinary {\n                            asText: [Function (anonymous)],\n                            asHex: [Function (anonymous)],\n                            asOpaqueHex: [Function (anonymous)],\n                            asBytes: [Function (anonymous)],\n                            asOpaqueBytes: [Function (anonymous)]\n                          }\n                        }\n                      }\n                    }\n                  }\n                }\n              },\n              {\n                type: 'SetTopic',\n                value: FixedSizeBinary {\n                  asText: [Function (anonymous)],\n                  asHex: [Function (anonymous)],\n                  asOpaqueHex: [Function (anonymous)],\n                  asBytes: [Function (anonymous)],\n                  asOpaqueBytes: [Function (anonymous)]\n                }\n              }\n            ]\n          }\n        }      \n      
\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 3, "depth": 3, "title": "Dry Run XCM", "anchor": "dry-run-xcm", "start_char": 10303, "end_char": 16591, "estimated_token_count": 1150, "token_estimator": "heuristic-v1", "text": "### Dry Run XCM\n\nThis API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains.\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `origin_location` ++\"VersionedLocation\"++ ++\"required\"++\n\n The location of the origin that will execute the xcm message.\n\n ---\n\n `xcm` ++\"VersionedXcm\"++ ++\"required\"++\n\n A versioned XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `XcmDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a [teleport asset transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain. The code shows how to test and verify the received XCM message's behavior in the destination chain through a dry run on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin\n const origin = XcmVersionedLocation.V3({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n });\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: 
XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute dry run xcm\n const dryRunResult = await paseoAssetHubApi.apis.DryRunApi.dry_run_xcm(\n origin,\n xcm,\n );\n\n // Print the results\n console.dir(dryRunResult.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            type: 'Complete',\n            value: { used: { ref_time: 15574200000n, proof_size: 359300n } }\n          },\n          emitted_events: [\n            {\n              type: 'System',\n              value: {\n                type: 'NewAccount',\n                value: { account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET' }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Endowed',\n                value: {\n                  account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  free_balance: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Minted',\n                value: {\n                  who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  amount: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: { type: 'Issued', value: { amount: 1796500000n } }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Deposit',\n                value: {\n                  who: '13UVJyLgBASGhE2ok3TvxUfaQBGUt88JCcdYjHvUhvQkFTTx',\n                  amount: 1796500000n\n                }\n              }\n            }\n          ],\n          forwarded_xcms: [\n            [\n              {\n                type: 'V4',\n                value: { parents: 1, interior: { type: 'Here', value: undefined } }\n              },\n              []\n            ]\n          ]\n        }\n      
\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 4, "depth": 2, "title": "XCM Payment API", "anchor": "xcm-payment-api", "start_char": 16591, "end_char": 17677, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## XCM Payment API\n\nThe [XCM Payment API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/fees/trait.XcmPaymentApi.html){target=\\_blank} provides a standardized way to determine the costs and payment options for executing XCM messages. Specifically, it enables clients to:\n\n- Retrieve the [weight](/polkadot-protocol/glossary/#weight) required to execute an XCM message.\n- Obtain a list of acceptable `AssetIds` for paying execution fees.\n- Calculate the cost of the weight in a specified `AssetId`.\n- Estimate the fees for XCM message delivery.\n\nThis API eliminates the need for clients to guess execution fees or identify acceptable assets manually. Instead, clients can query the list of supported asset IDs formatted according to the XCM version they understand. With this information, they can weigh the XCM program they intend to execute and convert the computed weight into its cost using one of the acceptable assets.\n\nTo use the API effectively, the client must already know the XCM program to be executed and the chains involved in the program's execution."} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 5, "depth": 3, "title": "Query Acceptable Payment Assets", "anchor": "query-acceptable-payment-assets", "start_char": 17677, "end_char": 20226, "estimated_token_count": 563, "token_estimator": "heuristic-v1", "text": "### Query Acceptable Payment Assets\n\nRetrieves the list of assets that are acceptable for paying fees when using a specific XCM version\n\n```rust\n\n```\n\n??? interface \"Input parameters\"\n\n `xcm_version` ++\"Version\"++ ++\"required\"++\n\n Specifies the XCM version that will be used to send the XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n A list of acceptable payment assets. Each asset is provided in a versioned format (`VersionedAssetId`) that matches the specified XCM version. If an error occurs, it is returned instead of the asset list.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the acceptable payment assets for executing XCM messages on the Paseo Asset Hub network using XCM version 3.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the xcm version to use\n const xcmVersion = 3;\n\n // Execute the runtime call to query the assets\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_acceptable_payment_assets(\n xcmVersion,\n );\n\n // Print the assets\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        [\n          {\n            type: 'V3',\n            value: {\n              type: 'Concrete',\n              value: { parents: 1, interior: { type: 'Here', value: undefined } }\n            }\n          }\n        ]\n      
\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 6, "depth": 3, "title": "Query XCM Weight", "anchor": "query-xcm-weight", "start_char": 20226, "end_char": 24827, "estimated_token_count": 922, "token_estimator": "heuristic-v1", "text": "### Query XCM Weight\n\nCalculates the weight required to execute a given XCM message. It is useful for estimating the execution cost of a cross-chain message in the destination chain before sending it.\n\n```rust\nfn query_xcm_weight(message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n A versioned XCM message whose execution weight is being queried.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated weight required to execute the provided XCM message. If the calculation fails, an error is returned instead.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the weight needed to execute a [teleport transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain using the XCM Payment API. 
The result shows the required weight in terms of reference time and proof size needed in the destination chain.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query weight runtime call\n const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n { ref_time: 15574200000n, proof_size: 359300n }\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 7, "depth": 3, "title": "Query Weight to Asset Fee", "anchor": "query-weight-to-asset-fee", "start_char": 24827, "end_char": 27869, "estimated_token_count": 699, "token_estimator": "heuristic-v1", "text": "### Query Weight to Asset Fee\n\nConverts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset.\n\n```rust\nfn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `weight` ++\"Weight\"++ ++\"required\"++\n \n The execution weight to be converted into a fee.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ---\n\n `asset` ++\"VersionedAssetId\"++ ++\"required\"++\n \n The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The fee needed to pay for the execution for the given `AssetId.`\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the weight to convert to fee\n const weight = { ref_time: 15574200000n, proof_size: 359300n };\n\n // Define the versioned asset id\n const versionedAssetId = {\n type: 'V4',\n value: { parents: 1, interior: { type: 'Here', value: undefined } },\n };\n\n // Execute the runtime call to convert the weight to fee\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(\n weight,\n versionedAssetId,\n );\n\n // Print the fee\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n 1796500000n\n
\n\n ---"} -{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 8, "depth": 3, "title": "Query Delivery Fees", "anchor": "query-delivery-fees", "start_char": 27869, "end_char": 32843, "estimated_token_count": 965, "token_estimator": "heuristic-v1", "text": "### Query Delivery Fees\n\nRetrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain.\n\n```rust\nfn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `destination` ++\"VersionedLocation\"++ ++\"required\"++\n \n The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms.\n\n ---\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseo,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const paseoAssetHubParaID = 1000;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the destination\n const destination = XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)),\n });\n\n // Define the xcm message that will be sent to the destination\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query delivery fees runtime call\n const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees(\n destination,\n xcm,\n );\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          type: 'V3',\n          value: [\n            {\n              id: {\n                type: 'Concrete',\n                value: { parents: 0, interior: { type: 'Here', value: undefined } }\n              },\n              fun: { type: 'Fungible', value: 396000000n }\n            }\n          ]\n        }\n      
\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 2, "depth": 3, "title": "Dry Run Call", "anchor": "dry-run-call", "start_char": 1492, "end_char": 10429, "estimated_token_count": 1647, "token_estimator": "heuristic-v1", "text": "### Dry Run Call\n\nThis API allows a dry-run of any extrinsic and obtaining the outcome if it fails or succeeds, as well as the local xcm and remote xcm messages sent to other chains.\n\n```rust\nfn dry_run_call(origin: OriginCaller, call: Call, result_xcms_version: XcmVersion) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `origin` ++\"OriginCaller\"++ ++\"required\"++\n \n The origin used for executing the transaction.\n\n ---\n\n `call` ++\"Call\"++ ++\"required\"++\n\n The extrinsic to be executed.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `CallDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `local_xcm` ++\"Option>\"++\n\n The local XCM that was attempted to be executed, if any.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a cross-chain asset transfer from the Paseo network to the Pop Network using a [reserve transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#reserve-asset-transfer){target=\\_blank} mechanism. 
Instead of executing the actual transfer, the code shows how to test and verify the transaction's behavior through a dry run before performing it on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { paseo } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n PolkadotRuntimeOriginCaller,\n XcmVersionedLocation,\n XcmVersionedAssets,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n } from '@polkadot-api/descriptors';\n import { DispatchRawOrigin } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to the Paseo relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const popParaID = 4001;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin caller\n // This is a regular signed account owned by a user\n let origin = PolkadotRuntimeOriginCaller.system(\n DispatchRawOrigin.Signed(userAddress),\n );\n\n // Define a transaction to transfer assets from Polkadot to Pop Network using a Reserve Transfer\n const tx = paseoApi.tx.XcmPallet.limited_reserve_transfer_assets({\n dest: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.Parachain(popParaID), // Destination is the Pop Network parachain\n ),\n }),\n beneficiary: XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n // Beneficiary address on Pop Network\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n }),\n assets: XcmVersionedAssets.V3([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 0,\n interior: XcmV3Junctions.Here(), // Native asset from the sender. 
In this case PAS\n }),\n fun: XcmV3MultiassetFungibility.Fungible(120000000000n), // Asset amount to transfer\n },\n ]),\n fee_asset_item: 0, // Asset used to pay transaction fees\n weight_limit: XcmV3WeightLimit.Unlimited(), // No weight limit on transaction\n });\n\n // Execute the dry run call to simulate the transaction\n const dryRunResult = await paseoApi.apis.DryRunApi.dry_run_call(\n origin,\n tx.decodedCall,\n );\n\n // Extract the data from the dry run result\n const {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n forwarded_xcms: forwardedXcms,\n } = dryRunResult.value;\n\n // Extract the XCM generated by this call\n const xcmsToPop = forwardedXcms.find(\n ([location, _]) =>\n location.type === 'V4' &&\n location.value.parents === 0 &&\n location.value.interior.type === 'X1' &&\n location.value.interior.value.type === 'Parachain' &&\n location.value.interior.value.value === popParaID, // Pop network's ParaID\n );\n const destination = xcmsToPop[0];\n const remoteXcm = xcmsToPop[1][0];\n\n // Print the results\n const resultObject = {\n execution_result: executionResult,\n emitted_events: emmittedEvents,\n local_xcm: localXcm,\n destination: destination,\n remote_xcm: remoteXcm,\n };\n\n console.dir(resultObject, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            success: true,\n            value: {\n              actual_weight: undefined,\n              pays_fee: { type: 'Yes', value: undefined }\n            }\n          },\n          emitted_events: [\n                ...\n          ],\n          local_xcm: undefined,\n          destination: {\n            type: 'V4',\n            value: {\n              parents: 0,\n              interior: { type: 'X1', value: { type: 'Parachain', value: 4001 } }\n            }\n          },\n          remote_xcm: {\n            type: 'V3',\n            value: [\n              {\n                type: 'ReserveAssetDeposited',\n                value: [\n                  {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  }\n                ]\n              },\n              { type: 'ClearOrigin', value: undefined },\n              {\n                type: 'BuyExecution',\n                value: {\n                  fees: {\n                    id: {\n                      type: 'Concrete',\n                      value: {\n                        parents: 1,\n                        interior: { type: 'Here', value: undefined }\n                      }\n                    },\n                    fun: { type: 'Fungible', value: 120000000000n }\n                  },\n                  weight_limit: { type: 'Unlimited', value: undefined }\n                }\n              },\n              {\n                type: 'DepositAsset',\n                value: {\n                  assets: { type: 'Wild', value: { type: 'AllCounted', value: 1 } },\n                  beneficiary: {\n                    parents: 0,\n                    interior: {\n                      type: 'X1',\n                      value: {\n                        type: 'AccountId32',\n                        value: {\n                          network: undefined,\n                          id: FixedSizeBinary {\n                            asText: [Function (anonymous)],\n                            asHex: [Function (anonymous)],\n                            asOpaqueHex: [Function (anonymous)],\n                            asBytes: [Function (anonymous)],\n                            asOpaqueBytes: [Function (anonymous)]\n                          }\n                        }\n                      }\n                    }\n                  }\n                }\n              },\n              {\n                type: 'SetTopic',\n                value: FixedSizeBinary {\n                  asText: [Function (anonymous)],\n                  asHex: [Function (anonymous)],\n                  asOpaqueHex: [Function (anonymous)],\n                  asBytes: [Function (anonymous)],\n                  asOpaqueBytes: [Function (anonymous)]\n                }\n              }\n            ]\n          }\n        }      \n      
\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 3, "depth": 3, "title": "Dry Run XCM", "anchor": "dry-run-xcm", "start_char": 10429, "end_char": 16835, "estimated_token_count": 1176, "token_estimator": "heuristic-v1", "text": "### Dry Run XCM\n\nThis API allows the direct dry-run of an xcm message instead of an extrinsic one, checks if it will execute successfully, and determines what other xcm messages will be forwarded to other chains.\n\n```rust\nfn dry_run_xcm(origin_location: VersionedLocation, xcm: VersionedXcm) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `origin_location` ++\"VersionedLocation\"++ ++\"required\"++\n\n The location of the origin that will execute the xcm message.\n\n ---\n\n `xcm` ++\"VersionedXcm\"++ ++\"required\"++\n\n A versioned XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n Effects of dry-running an extrinsic. If an error occurs, it is returned instead of the effects.\n\n ??? child \"Type `XcmDryRunEffects`\"\n\n `execution_result` ++\"DispatchResultWithPostInfo\"++\n\n The result of executing the extrinsic.\n\n ---\n\n `emitted_events` ++\"Vec\"++\n\n The list of events fired by the extrinsic.\n\n ---\n\n `forwarded_xcms` ++\"Vec<(VersionedLocation, Vec>)>\"++\n\n The list of XCMs that were queued for sending.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to simulate a [teleport asset transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain. 
The code shows how to test and verify the received XCM message's behavior in the destination chain through a dry run on the live network.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the origin\n const origin = XcmVersionedLocation.V3({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n });\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute dry run xcm\n const dryRunResult = await paseoAssetHubApi.apis.DryRunApi.dry_run_xcm(\n origin,\n xcm,\n );\n\n // Print the results\n console.dir(dryRunResult.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          execution_result: {\n            type: 'Complete',\n            value: { used: { ref_time: 15574200000n, proof_size: 359300n } }\n          },\n          emitted_events: [\n            {\n              type: 'System',\n              value: {\n                type: 'NewAccount',\n                value: { account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET' }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Endowed',\n                value: {\n                  account: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  free_balance: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Minted',\n                value: {\n                  who: '12pGtwHPL4tUAUcyeCoJ783NKRspztpWmXv4uxYRwiEnYNET',\n                  amount: 10203500000n\n                }\n              }\n            },\n            {\n              type: 'Balances',\n              value: { type: 'Issued', value: { amount: 1796500000n } }\n            },\n            {\n              type: 'Balances',\n              value: {\n                type: 'Deposit',\n                value: {\n                  who: '13UVJyLgBASGhE2ok3TvxUfaQBGUt88JCcdYjHvUhvQkFTTx',\n                  amount: 1796500000n\n                }\n              }\n            }\n          ],\n          forwarded_xcms: [\n            [\n              {\n                type: 'V4',\n                value: { parents: 1, interior: { type: 'Here', value: undefined } }\n              },\n              []\n            ]\n          ]\n        }\n      
\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 4, "depth": 2, "title": "XCM Payment API", "anchor": "xcm-payment-api", "start_char": 16835, "end_char": 17921, "estimated_token_count": 222, "token_estimator": "heuristic-v1", "text": "## XCM Payment API\n\nThe [XCM Payment API](https://paritytech.github.io/polkadot-sdk/master/xcm_runtime_apis/fees/trait.XcmPaymentApi.html){target=\\_blank} provides a standardized way to determine the costs and payment options for executing XCM messages. Specifically, it enables clients to:\n\n- Retrieve the [weight](/polkadot-protocol/glossary/#weight) required to execute an XCM message.\n- Obtain a list of acceptable `AssetIds` for paying execution fees.\n- Calculate the cost of the weight in a specified `AssetId`.\n- Estimate the fees for XCM message delivery.\n\nThis API eliminates the need for clients to guess execution fees or identify acceptable assets manually. Instead, clients can query the list of supported asset IDs formatted according to the XCM version they understand. With this information, they can weigh the XCM program they intend to execute and convert the computed weight into its cost using one of the acceptable assets.\n\nTo use the API effectively, the client must already know the XCM program to be executed and the chains involved in the program's execution."} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 5, "depth": 3, "title": "Query Acceptable Payment Assets", "anchor": "query-acceptable-payment-assets", "start_char": 17921, "end_char": 20567, "estimated_token_count": 582, "token_estimator": "heuristic-v1", "text": "### Query Acceptable Payment Assets\n\nRetrieves the list of assets that are acceptable for paying fees when using a specific XCM version\n\n```rust\nfn query_acceptable_payment_assets(xcm_version: Version) -> Result, Error>;\n```\n\n??? interface \"Input parameters\"\n\n `xcm_version` ++\"Version\"++ ++\"required\"++\n\n Specifies the XCM version that will be used to send the XCM message.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result, Error>\"++\n\n A list of acceptable payment assets. Each asset is provided in a versioned format (`VersionedAssetId`) that matches the specified XCM version. If an error occurs, it is returned instead of the asset list.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the acceptable payment assets for executing XCM messages on the Paseo Asset Hub network using XCM version 3.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the xcm version to use\n const xcmVersion = 3;\n\n // Execute the runtime call to query the assets\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_acceptable_payment_assets(\n xcmVersion,\n );\n\n // Print the assets\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        [\n          {\n            type: 'V3',\n            value: {\n              type: 'Concrete',\n              value: { parents: 1, interior: { type: 'Here', value: undefined } }\n            }\n          }\n        ]\n      
\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 6, "depth": 3, "title": "Query XCM Weight", "anchor": "query-xcm-weight", "start_char": 20567, "end_char": 25168, "estimated_token_count": 922, "token_estimator": "heuristic-v1", "text": "### Query XCM Weight\n\nCalculates the weight required to execute a given XCM message. It is useful for estimating the execution cost of a cross-chain message in the destination chain before sending it.\n\n```rust\nfn query_xcm_weight(message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n A versioned XCM message whose execution weight is being queried.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated weight required to execute the provided XCM message. If the calculation fails, an error is returned instead.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the weight needed to execute a [teleport transfer](https://wiki.polkadot.com/learn/learn-xcm-usecases/#asset-teleportation){target=\\_blank} from the Paseo network to the Paseo Asset Hub parachain using the XCM Payment API. 
The result shows the required weight in terms of reference time and proof size needed in the destination chain.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseoAssetHub,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n // Connect to Paseo Asset Hub\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define a xcm message comming from the Paseo relay chain to Asset Hub to Teleport some tokens\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query weight runtime call\n const result = await paseoAssetHubApi.apis.XcmPaymentApi.query_xcm_weight(xcm);\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n { ref_time: 15574200000n, proof_size: 359300n }\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 7, "depth": 3, "title": "Query Weight to Asset Fee", "anchor": "query-weight-to-asset-fee", "start_char": 25168, "end_char": 28210, "estimated_token_count": 699, "token_estimator": "heuristic-v1", "text": "### Query Weight to Asset Fee\n\nConverts a given weight into the corresponding fee for a specified `AssetId`. It allows clients to determine the cost of execution in terms of the desired asset.\n\n```rust\nfn query_weight_to_asset_fee(weight: Weight, asset: VersionedAssetId) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `weight` ++\"Weight\"++ ++\"required\"++\n \n The execution weight to be converted into a fee.\n\n ??? child \"Type `Weight`\"\n\n `ref_time` ++\"u64\"++\n\n The weight of computational time used based on some reference hardware.\n\n ---\n\n `proof_size` ++\"u64\"++\n\n The weight of storage space used by proof of validity.\n\n ---\n\n ---\n\n `asset` ++\"VersionedAssetId\"++ ++\"required\"++\n \n The asset in which the fee will be calculated. This must be a versioned asset ID compatible with the runtime.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The fee needed to pay for the execution for the given `AssetId.`\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? interface \"Example\"\n\n This example demonstrates how to calculate the fee for a given execution weight using a specific versioned asset ID (PAS token) on Paseo Asset Hub.\n\n ***Usage with PAPI***\n\n ```js\n import { paseoAssetHub } from '@polkadot-api/descriptors';\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n\n // Connect to the polkadot relay chain\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://asset-hub-paseo-rpc.dwellir.com')),\n );\n\n const paseoAssetHubApi = client.getTypedApi(paseoAssetHub);\n\n // Define the weight to convert to fee\n const weight = { ref_time: 15574200000n, proof_size: 359300n };\n\n // Define the versioned asset id\n const versionedAssetId = {\n type: 'V4',\n value: { parents: 1, interior: { type: 'Here', value: undefined } },\n };\n\n // Execute the runtime call to convert the weight to fee\n const result =\n await paseoAssetHubApi.apis.XcmPaymentApi.query_weight_to_asset_fee(\n weight,\n versionedAssetId,\n );\n\n // Print the fee\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n 1796500000n\n
\n\n ---"} +{"page_id": "develop-interoperability-xcm-runtime-apis", "page_title": "XCM Runtime APIs", "index": 8, "depth": 3, "title": "Query Delivery Fees", "anchor": "query-delivery-fees", "start_char": 28210, "end_char": 33184, "estimated_token_count": 965, "token_estimator": "heuristic-v1", "text": "### Query Delivery Fees\n\nRetrieves the delivery fees for sending a specific XCM message to a designated destination. The fees are always returned in a specific asset defined by the destination chain.\n\n```rust\nfn query_delivery_fees(destination: VersionedLocation, message: VersionedXcm<()>) -> Result;\n```\n\n??? interface \"Input parameters\"\n\n `destination` ++\"VersionedLocation\"++ ++\"required\"++\n \n The target location where the message will be sent. Fees may vary depending on the destination, as different destinations often have unique fee structures and sender mechanisms.\n\n ---\n\n `message` ++\"VersionedXcm<()>\"++ ++\"required\"++\n \n The XCM message to be sent. The delivery fees are calculated based on the message's content and size, which can influence the cost.\n\n ---\n\n??? interface \"Output parameters\"\n\n ++\"Result\"++\n \n The calculated delivery fees expressed in a specific asset supported by the destination chain. If an error occurs during the query, it returns an error instead.\n\n ??? child \"Type `Error`\"\n\n Enum:\n\n - **`Unimplemented`**: An API part is unsupported.\n - **`VersionedConversionFailed`**: Converting a versioned data structure from one version to another failed.\n - **`WeightNotComputable`**: XCM message weight calculation failed.\n - **`UnhandledXcmVersion`**: XCM version not able to be handled.\n - **`AssetNotFound`**: The given asset is not handled as a fee asset.\n - **`Unroutable`**: Destination is known to be unroutable.\n\n ---\n\n??? 
interface \"Example\"\n\n This example demonstrates how to query the delivery fees for sending an XCM message from Paseo to Paseo Asset Hub.\n\n Replace `INSERT_USER_ADDRESS` with your SS58 address before running the script.\n\n ***Usage with PAPI***\n\n ```js\n import { createClient } from 'polkadot-api';\n import { getWsProvider } from 'polkadot-api/ws-provider/web';\n import { withPolkadotSdkCompat } from 'polkadot-api/polkadot-sdk-compat';\n import {\n XcmVersionedXcm,\n paseo,\n XcmVersionedLocation,\n XcmV3Junction,\n XcmV3Junctions,\n XcmV3WeightLimit,\n XcmV3MultiassetFungibility,\n XcmV3MultiassetAssetId,\n XcmV3Instruction,\n XcmV3MultiassetMultiAssetFilter,\n XcmV3MultiassetWildMultiAsset,\n } from '@polkadot-api/descriptors';\n import { Binary } from 'polkadot-api';\n import { ss58Decode } from '@polkadot-labs/hdkd-helpers';\n\n const client = createClient(\n withPolkadotSdkCompat(getWsProvider('wss://paseo-rpc.dwellir.com')),\n );\n\n const paseoApi = client.getTypedApi(paseo);\n\n const paseoAssetHubParaID = 1000;\n const userAddress = 'INSERT_USER_ADDRESS';\n const userPublicKey = ss58Decode(userAddress)[0];\n const idBeneficiary = Binary.fromBytes(userPublicKey);\n\n // Define the destination\n const destination = XcmVersionedLocation.V3({\n parents: 0,\n interior: XcmV3Junctions.X1(XcmV3Junction.Parachain(paseoAssetHubParaID)),\n });\n\n // Define the xcm message that will be sent to the destination\n const xcm = XcmVersionedXcm.V3([\n XcmV3Instruction.ReceiveTeleportedAsset([\n {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(12000000000n),\n },\n ]),\n XcmV3Instruction.ClearOrigin(),\n XcmV3Instruction.BuyExecution({\n fees: {\n id: XcmV3MultiassetAssetId.Concrete({\n parents: 1,\n interior: XcmV3Junctions.Here(),\n }),\n fun: XcmV3MultiassetFungibility.Fungible(BigInt(12000000000n)),\n },\n weight_limit: XcmV3WeightLimit.Unlimited(),\n }),\n XcmV3Instruction.DepositAsset({\n assets: XcmV3MultiassetMultiAssetFilter.Wild(\n XcmV3MultiassetWildMultiAsset.All(),\n ),\n beneficiary: {\n parents: 0,\n interior: XcmV3Junctions.X1(\n XcmV3Junction.AccountId32({\n network: undefined,\n id: idBeneficiary,\n }),\n ),\n },\n }),\n ]);\n\n // Execute the query delivery fees runtime call\n const result = await paseoApi.apis.XcmPaymentApi.query_delivery_fees(\n destination,\n xcm,\n );\n\n // Print the results\n console.dir(result.value, { depth: null });\n\n client.destroy();\n\n ```\n\n ***Output***\n\n
\n
\n        {\n          type: 'V3',\n          value: [\n            {\n              id: {\n                type: 'Concrete',\n                value: { parents: 0, interior: { type: 'Here', value: undefined } }\n              },\n              fun: { type: 'Fungible', value: 396000000n }\n            }\n          ]\n        }\n      
\n
\n\n ---"} {"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 872, "end_char": 922, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "develop-interoperability", "page_title": "Interoperability", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 922, "end_char": 2331, "estimated_token_count": 378, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "develop-networks", "page_title": "Networks", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 12, "end_char": 513, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nThe Polkadot ecosystem consists of multiple networks designed to support different stages of blockchain development, from main networks to test networks. Each network serves a unique purpose, providing developers with flexible environments for building, testing, and deploying blockchain applications.\n\nThis section includes essential network information such as RPC endpoints, currency symbols and decimals, and how to acquire TestNet tokens for the Polkadot ecosystem of networks."} @@ -169,9 +169,9 @@ {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 3, "depth": 2, "title": "Working with the Docker Container", "anchor": "working-with-the-docker-container", "start_char": 2108, "end_char": 3426, "estimated_token_count": 337, "token_estimator": "heuristic-v1", "text": "## Working with the Docker Container\n\nThe [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} package is a command-line utility written in Rust that installs an executable program called `srtool`. This program simplifies the interactions with the `srtool` Docker container.\n\nOver time, the tooling around the `srtool` Docker image has expanded to include the following tools and helper programs:\n\n- **[`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank}**: Provides a command-line interface to pull the srtool Docker image, get information about the image and tooling used to interact with it, and build the runtime using the `srtool` Docker container.\n- **[`subwasm`](https://github.com/chevdor/subwasm){target=\\_blank}**: Provides command-line options for working with the metadata and Wasm runtime built using srtool. 
The `subwasm` program is also used internally to perform tasks in the `srtool` image.\n- **[`srtool-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank}**: Provides GitHub actions to integrate builds produced using the `srtool` image with your GitHub CI/CD pipelines.\n- **[`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank}**: Provides a simple graphical user interface for building the runtime using the `srtool` Docker image."} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 4, "depth": 2, "title": "Prepare the Environment", "anchor": "prepare-the-environment", "start_char": 3426, "end_char": 4413, "estimated_token_count": 235, "token_estimator": "heuristic-v1", "text": "## Prepare the Environment\n\nIt is recommended to install the `srtool-cli` program to work with the Docker image using a simple command-line interface.\n\nTo prepare the environment:\n\n1. Verify that Docker is installed by running the following command:\n\n ```bash\n docker --version\n ```\n\n If Docker is installed, the command will display version information:\n\n
\n docker --version\n Docker version 20.10.17, build 100c701\n
\n\n2. Install the `srtool` command-line interface by running the following command:\n\n ```bash\n cargo install --git https://github.com/chevdor/srtool-cli\n ```\n\n3. View usage information for the `srtool` command-line interface by running the following command:\n\n ```bash\n srtool help\n ```\n\n4. Download the latest `srtool` Docker image by running the following command:\n\n ```bash\n srtool pull\n ```"} {"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 5, "depth": 2, "title": "Start a Deterministic Build", "anchor": "start-a-deterministic-build", "start_char": 4413, "end_char": 5316, "estimated_token_count": 212, "token_estimator": "heuristic-v1", "text": "## Start a Deterministic Build\n\nAfter preparing the environment, the Wasm runtime can be compiled using the `srtool` Docker image.\n\nTo build the runtime, you need to open your Polkadot SDK-based project in a terminal shell and run the following command:\n\n```bash\nsrtool build --app --package INSERT_RUNTIME_PACKAGE_NAME --runtime-dir INSERT_RUNTIME_PATH \n```\n\n- The name specified for the `--package` should be the name defined in the `Cargo.toml` file for the runtime.\n- The path specified for the `--runtime-dir` should be the path to the `Cargo.toml` file for the runtime. For example:\n\n ```plain\n node/\n pallets/\n runtime/\n ├──lib.rs\n └──Cargo.toml # INSERT_RUNTIME_PATH should be the path to this file\n ...\n ```\n\n- If the `Cargo.toml` file for the runtime is located in a `runtime` subdirectory, for example, `runtime/kusama`, the `--runtime-dir` parameter can be omitted."} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 6, "depth": 2, "title": "Use srtool in GitHub Actions", "anchor": "use-srtool-in-github-actions", "start_char": 5316, "end_char": 6136, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "## Use srtool in GitHub Actions\n\nTo add a GitHub workflow for building the runtime:\n\n1. Create a `.github/workflows` directory in the chain's directory.\n2. In the `.github/workflows` directory, click **Add file**, then select **Create new file**.\n3. Copy the sample GitHub action from `basic.yml` example in the [`srtools-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank} repository and paste it into the file you created in the previous step.\n\n ??? interface \"`basic.yml`\"\n\n {% raw %}\n ```yml\n \n ```\n {% endraw %}\n\n4. Modify the settings in the sample action.\n\n For example, modify the following settings:\n\n - The name of the chain.\n - The name of the runtime package.\n - The location of the runtime.\n\n5. Type a name for the action file and commit."} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 7, "depth": 2, "title": "Use the srtool Image via Docker Hub", "anchor": "use-the-srtool-image-via-docker-hub", "start_char": 6136, "end_char": 6926, "estimated_token_count": 215, "token_estimator": "heuristic-v1", "text": "## Use the srtool Image via Docker Hub\n\nIf utilizing [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} or [`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank} isn't an option, the `paritytech/srtool` container image can be used directly via Docker Hub.\n\nTo pull the image from Docker Hub:\n\n1. Sign in to Docker Hub.\n2. 
Type `paritytech/srtool` in the search field and press enter.\n3. Click **paritytech/srtool**, then click **Tags**.\n4. Copy the command for the image you want to pull.\n5. Open a terminal shell on your local computer.\n6. Paste the command you copied from the Docker Hub. For example, you might run a command similar to the following, which downloads and unpacks the image:\n\n ```bash\n docker pull paritytech/srtool:1.88.0\n ```"} -{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 8, "depth": 3, "title": "Naming Convention for Images", "anchor": "naming-convention-for-images", "start_char": 6926, "end_char": 7622, "estimated_token_count": 156, "token_estimator": "heuristic-v1", "text": "### Naming Convention for Images\n\nKeep in mind that there is no `latest` tag for the `srtool` image. Ensure that the image selected is compatible with the locally available version of the Rust compiler.\n\nThe naming convention for `paritytech/srtool` Docker images specifies the version of the Rust compiler used to compile the code included in the image. Some images specify both a compiler version and the version of the build script used. For example, an image named `paritytech/srtool:1.62.0-0.9.19` was compiled with version `1.62.0` of the `rustc` compiler and version `0.9.19` of the build script. Images that only specify the compiler version always contain the software's latest version."} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 6, "depth": 2, "title": "Use srtool in GitHub Actions", "anchor": "use-srtool-in-github-actions", "start_char": 5316, "end_char": 6984, "estimated_token_count": 376, "token_estimator": "heuristic-v1", "text": "## Use srtool in GitHub Actions\n\nTo add a GitHub workflow for building the runtime:\n\n1. Create a `.github/workflows` directory in the chain's directory.\n2. In the `.github/workflows` directory, click **Add file**, then select **Create new file**.\n3. Copy the sample GitHub action from `basic.yml` example in the [`srtools-actions`](https://github.com/chevdor/srtool-actions){target=\\_blank} repository and paste it into the file you created in the previous step.\n\n ??? interface \"`basic.yml`\"\n\n {% raw %}\n ```yml\n name: Srtool build\n\n on: push\n\n jobs:\n srtool:\n runs-on: ubuntu-latest\n strategy:\n matrix:\n chain: [\"asset-hub-kusama\", \"asset-hub-westend\"]\n steps:\n - uses: actions/checkout@v3\n - name: Srtool build\n id: srtool_build\n uses: chevdor/srtool-actions@v0.8.0\n with:\n chain: ${{ matrix.chain }}\n runtime_dir: polkadot-parachains/${{ matrix.chain }}-runtime\n - name: Summary\n run: |\n echo '${{ steps.srtool_build.outputs.json }}' | jq . > ${{ matrix.chain }}-srtool-digest.json\n cat ${{ matrix.chain }}-srtool-digest.json\n echo \"Runtime location: ${{ steps.srtool_build.outputs.wasm }}\"\n ```\n {% endraw %}\n\n4. Modify the settings in the sample action.\n\n For example, modify the following settings:\n\n - The name of the chain.\n - The name of the runtime package.\n - The location of the runtime.\n\n5. 
Type a name for the action file and commit."} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 7, "depth": 2, "title": "Use the srtool Image via Docker Hub", "anchor": "use-the-srtool-image-via-docker-hub", "start_char": 6984, "end_char": 7774, "estimated_token_count": 215, "token_estimator": "heuristic-v1", "text": "## Use the srtool Image via Docker Hub\n\nIf utilizing [`srtool-cli`](https://github.com/chevdor/srtool-cli){target=\\_blank} or [`srtool-app`](https://gitlab.com/chevdor/srtool-app){target=\\_blank} isn't an option, the `paritytech/srtool` container image can be used directly via Docker Hub.\n\nTo pull the image from Docker Hub:\n\n1. Sign in to Docker Hub.\n2. Type `paritytech/srtool` in the search field and press enter.\n3. Click **paritytech/srtool**, then click **Tags**.\n4. Copy the command for the image you want to pull.\n5. Open a terminal shell on your local computer.\n6. Paste the command you copied from the Docker Hub. For example, you might run a command similar to the following, which downloads and unpacks the image:\n\n ```bash\n docker pull paritytech/srtool:1.88.0\n ```"} +{"page_id": "develop-parachains-deployment-build-deterministic-runtime", "page_title": "Build a deterministic runtime", "index": 8, "depth": 3, "title": "Naming Convention for Images", "anchor": "naming-convention-for-images", "start_char": 7774, "end_char": 8470, "estimated_token_count": 156, "token_estimator": "heuristic-v1", "text": "### Naming Convention for Images\n\nKeep in mind that there is no `latest` tag for the `srtool` image. Ensure that the image selected is compatible with the locally available version of the Rust compiler.\n\nThe naming convention for `paritytech/srtool` Docker images specifies the version of the Rust compiler used to compile the code included in the image. Some images specify both a compiler version and the version of the build script used. For example, an image named `paritytech/srtool:1.62.0-0.9.19` was compiled with version `1.62.0` of the `rustc` compiler and version `0.9.19` of the build script. Images that only specify the compiler version always contain the software's latest version."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 20, "end_char": 454, "estimated_token_count": 75, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCoretime can be purchased in bulk for a period of 28 days, providing access to Polkadot's shared security and interoperability for Polkadot parachains. The bulk purchase of coretime includes a rent-control mechanism that keeps future purchases within a predictable price range of the initial purchase. This allows cores to be renewed at a known price without competing against other participants in the open market."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 1, "depth": 2, "title": "Bulk Sale Phases", "anchor": "bulk-sale-phases", "start_char": 454, "end_char": 1474, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Bulk Sale Phases\n\nThe bulk sale process consists of three distinct phases:\n\n1. **Interlude phase**: The period between bulk sales when renewals are prioritized.\n2. 
**Lead-in phase**: Following the interlude phase, a new `start_price` is set, and a Dutch auction begins, lasting for `leadin_length` blocks. During this phase, prices experience downward pressure as the system aims to find market equilibrium. The final price at the end of this phase becomes the `regular_price`, which will be used in the subsequent fixed price phase.\n3. **Fixed price phase**: The final phase where remaining cores are sold at the `regular_price` established during the lead-in phase. This provides a stable and predictable pricing environment for participants who did not purchase during the price discovery period.\n\nFor more comprehensive information about the coretime sales process, refer to the [Coretime Sales](https://wiki.polkadot.com/learn/learn-agile-coretime/#coretime-sales){target=\\_blank} section in the Polkadot Wiki."} {"page_id": "develop-parachains-deployment-coretime-renewal", "page_title": "Coretime Renewal", "index": 2, "depth": 2, "title": "Renewal Timing", "anchor": "renewal-timing", "start_char": 1474, "end_char": 2107, "estimated_token_count": 118, "token_estimator": "heuristic-v1", "text": "## Renewal Timing\n\nWhile renewals can technically be made during any phase, it is strongly recommended that they be completed during the interlude phase. Delaying renewal introduces the risk that the core could be sold to another market participant, preventing successful renewal. Renewals must be initiated well in advance to avoid the scenario above. \n\nFor example, if you purchase a core in bulk sale #1, you obtain coretime for the upcoming bulk period (during which bulk sale #2 takes place).\nYour renewal must be completed during bulk sale #2, ideally during its interlude phase, to secure coretime for the subsequent period."} @@ -252,11 +252,11 @@ {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 2, "depth": 2, "title": "Implement Storage Migrations", "anchor": "implement-storage-migrations", "start_char": 4349, "end_char": 4975, "estimated_token_count": 155, "token_estimator": "heuristic-v1", "text": "## Implement Storage Migrations\n\nThe [`OnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.OnRuntimeUpgrade.html){target=\\_blank} trait provides the foundation for implementing storage migrations in your runtime. Here's a detailed look at its essential functions:\n\n```rust\npub trait OnRuntimeUpgrade {\n fn on_runtime_upgrade() -> Weight { ... }\n fn try_on_runtime_upgrade(checks: bool) -> Result { ... }\n fn pre_upgrade() -> Result, TryRuntimeError> { ... }\n fn post_upgrade(_state: Vec) -> Result<(), TryRuntimeError> { ... }\n}\n```"} {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 3, "depth": 3, "title": "Core Migration Function", "anchor": "core-migration-function", "start_char": 4975, "end_char": 6007, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "### Core Migration Function\n\nThe [`on_runtime_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.on_runtime_upgrade){target=\\_blank} function executes when the FRAME Executive pallet detects a runtime upgrade. 
Important considerations when using this function include:\n\n- It runs before any pallet's `on_initialize` hooks.\n- Critical storage items (like [`block_number`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.block_number){target=\\_blank}) may not be set.\n- Execution is mandatory and must be completed.\n- Careful weight calculation is required to prevent bricking the chain.\n\nWhen implementing the migration logic, your code must handle several vital responsibilities. A migration implementation must do the following to operate correctly:\n\n- Read existing storage values in their original format.\n- Transform data to match the new format.\n- Write updated values back to storage.\n- Calculate and return consumed weight."} {"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 4, "depth": 3, "title": "Migration Testing Hooks", "anchor": "migration-testing-hooks", "start_char": 6007, "end_char": 8023, "estimated_token_count": 399, "token_estimator": "heuristic-v1", "text": "### Migration Testing Hooks\n\nThe `OnRuntimeUpgrade` trait provides some functions designed specifically for testing migrations. These functions never execute on-chain but are essential for validating migration behavior in test environments. The migration test hooks are as follows:\n\n- **[`try_on_runtime_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.OnRuntimeUpgrade.html#method.try_on_runtime_upgrade){target=\\_blank}**: This function serves as the primary orchestrator for testing the complete migration process. It coordinates the execution flow from `pre-upgrade` checks through the actual migration to `post-upgrade` verification. Handling the entire migration sequence ensures that storage modifications occur correctly and in the proper order. Preserving this sequence is particularly valuable when testing multiple dependent migrations, where the execution order matters.\n\n- **[`pre_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.pre_upgrade){target=\\_blank}**: Before a runtime upgrade begins, the `pre_upgrade` function performs preliminary checks and captures the current state. It returns encoded state data that can be used for `post-upgrade` verification. This function must never modify storage: it should only read and verify the existing state. The data it returns includes critical state values that should remain consistent or transform predictably during migration.\n\n- **[`post_upgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.Hooks.html#method.post_upgrade){target=\\_blank}**: After the migration completes, `post_upgrade` validates its success. It receives the state data captured by `pre_upgrade` to verify that the migration was executed correctly. This function checks for storage consistency and ensures all data transformations are completed as expected. Like `pre_upgrade`, it operates exclusively in testing environments and should not modify storage."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 5, "depth": 3, "title": "Migration Structure", "anchor": "migration-structure", "start_char": 8023, "end_char": 10526, "estimated_token_count": 559, "token_estimator": "heuristic-v1", "text": "### Migration Structure\n\nThere are two approaches to implementing storage migrations. 
The first method involves directly implementing `OnRuntimeUpgrade` on structs. This approach requires manually checking the on-chain storage version against the new [`StorageVersion`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/struct.StorageVersion.html){target=\\_blank} and executing the transformation logic only when the check passes. This version verification prevents multiple executions of the migration during subsequent runtime upgrades.\n\nThe recommended approach is to implement [`UncheckedOnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.UncheckedOnRuntimeUpgrade.html){target=\\_blank} and wrap it with [`VersionedMigration`](https://paritytech.github.io/polkadot-sdk/master/frame_support/migrations/struct.VersionedMigration.html){target=\\_blank}. `VersionedMigration` implements `OnRuntimeUpgrade` and handles storage version management automatically, following best practices and reducing potential errors.\n\n`VersionedMigration` requires five type parameters:\n\n- **`From`**: The source version for the upgrade.\n- **`To`**: The target version for the upgrade.\n- **`Inner`**: The `UncheckedOnRuntimeUpgrade` implementation.\n- **`Pallet`**: The pallet being upgraded.\n- **`Weight`**: The runtime's [`RuntimeDbWeight`](https://paritytech.github.io/polkadot-sdk/master/frame_support/weights/struct.RuntimeDbWeight.html){target=\\_blank} implementation.\n\nExamine the following migration example that transforms a simple `StorageValue` storing a `u32` into a more complex structure that tracks both current and previous values using the `CurrentAndPreviousValue` struct:\n\n- Old `StorageValue` format:\n\n ```rust\n #[pallet::storage]\n pub type Value = StorageValue<_, u32>;\n ```\n\n- New `StorageValue` format:\n\n ```rust\n /// Example struct holding the most recently set [`u32`] and the\n /// second most recently set [`u32`] (if one existed).\n #[docify::export]\n #[derive(\n \tClone, Eq, PartialEq, Encode, Decode, RuntimeDebug, scale_info::TypeInfo, MaxEncodedLen,\n )]\n pub struct CurrentAndPreviousValue {\n \t/// The most recently set value.\n \tpub current: u32,\n \t/// The previous value, if one existed.\n \tpub previous: Option,\n }\n #[pallet::storage]\n \tpub type Value = StorageValue<_, CurrentAndPreviousValue>;\n ```\n\n- Migration:\n\n ```rust\n \n ```"} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 6, "depth": 3, "title": "Migration Organization", "anchor": "migration-organization", "start_char": 10526, "end_char": 11218, "estimated_token_count": 148, "token_estimator": "heuristic-v1", "text": "### Migration Organization\n\nBest practices recommend organizing migrations in a separate module within your pallet. 
Here's the recommended file structure:\n\n```plain\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── migrations/ # All migration-related code\n│ ├── mod.rs # Migrations module definition\n│ ├── v1.rs # V0 -> V1 migration\n│ └── v2.rs # V1 -> V2 migration\n└── Cargo.toml\n```\n\nThis structure provides several benefits:\n\n- Separates migration logic from core pallet functionality.\n- Makes migrations easier to test and maintain.\n- Provides explicit versioning of storage changes.\n- Simplifies the addition of future migrations."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 7, "depth": 3, "title": "Scheduling Migrations", "anchor": "scheduling-migrations", "start_char": 11218, "end_char": 11793, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "### Scheduling Migrations\n\nTo execute migrations during a runtime upgrade, you must configure them in your runtime's Executive pallet. Add your migrations in `runtime/src/lib.rs`:\n\n```rust\n/// Tuple of migrations (structs that implement `OnRuntimeUpgrade`)\ntype Migrations = (\n pallet_my_pallet::migrations::v1::Migration,\n // More migrations can be added here\n);\npub type Executive = frame_executive::Executive<\n Runtime,\n Block,\n frame_system::ChainContext,\n Runtime,\n AllPalletsWithSystem,\n Migrations, // Include migrations here\n>;\n\n```"} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 8, "depth": 2, "title": "Single-Block Migrations", "anchor": "single-block-migrations", "start_char": 11793, "end_char": 12881, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Single-Block Migrations\n\nSingle-block migrations execute their logic within one block immediately following a runtime upgrade. They run as part of the runtime upgrade process through the `OnRuntimeUpgrade` trait implementation and must be completed before any other runtime logic executes.\n\nWhile single-block migrations are straightforward to implement and provide immediate data transformation, they carry significant risks. The most critical consideration is that they must complete within one block's weight limits. This is especially crucial for parachains, where exceeding block weight limits will brick the chain.\n\nUse single-block migrations only when you can guarantee:\n\n- The migration has a bounded execution time.\n- Weight calculations are thoroughly tested.\n- Total weight will never exceed block limits.\n\nFor a complete implementation example of a single-block migration, refer to the [single-block migration example]( https://paritytech.github.io/polkadot-sdk/master/pallet_example_single_block_migrations/index.html){target=\\_blank} in the Polkadot SDK documentation."} -{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 9, "depth": 2, "title": "Multi Block Migrations", "anchor": "multi-block-migrations", "start_char": 12881, "end_char": 14162, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Multi Block Migrations\n\nMulti-block migrations distribute the migration workload across multiple blocks, providing a safer approach for production environments. The migration state is tracked in storage, allowing the process to pause and resume across blocks.\n\nThis approach is essential for production networks and parachains as the risk of exceeding block weight limits is eliminated. 
Multi-block migrations can safely handle large storage collections, unbounded data structures, and complex nested data types where weight consumption might be unpredictable.\n\nMulti-block migrations are ideal when dealing with:\n\n- Large-scale storage migrations.\n- Unbounded storage items or collections.\n- Complex data structures with uncertain weight costs.\n\nThe primary trade-off is increased implementation complexity, as you must manage the migration state and handle partial completion scenarios. However, multi-block migrations' significant safety benefits and operational reliability are typically worth the increased complexity.\n\nFor a complete implementation example of multi-block migrations, refer to the [official example](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/substrate/frame/examples/multi-block-migrations){target=\\_blank} in the Polkadot SDK."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 5, "depth": 3, "title": "Migration Structure", "anchor": "migration-structure", "start_char": 8023, "end_char": 14864, "estimated_token_count": 1637, "token_estimator": "heuristic-v1", "text": "### Migration Structure\n\nThere are two approaches to implementing storage migrations. The first method involves directly implementing `OnRuntimeUpgrade` on structs. This approach requires manually checking the on-chain storage version against the new [`StorageVersion`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/struct.StorageVersion.html){target=\\_blank} and executing the transformation logic only when the check passes. This version verification prevents multiple executions of the migration during subsequent runtime upgrades.\n\nThe recommended approach is to implement [`UncheckedOnRuntimeUpgrade`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/trait.UncheckedOnRuntimeUpgrade.html){target=\\_blank} and wrap it with [`VersionedMigration`](https://paritytech.github.io/polkadot-sdk/master/frame_support/migrations/struct.VersionedMigration.html){target=\\_blank}. 
`VersionedMigration` implements `OnRuntimeUpgrade` and handles storage version management automatically, following best practices and reducing potential errors.\n\n`VersionedMigration` requires five type parameters:\n\n- **`From`**: The source version for the upgrade.\n- **`To`**: The target version for the upgrade.\n- **`Inner`**: The `UncheckedOnRuntimeUpgrade` implementation.\n- **`Pallet`**: The pallet being upgraded.\n- **`Weight`**: The runtime's [`RuntimeDbWeight`](https://paritytech.github.io/polkadot-sdk/master/frame_support/weights/struct.RuntimeDbWeight.html){target=\\_blank} implementation.\n\nExamine the following migration example that transforms a simple `StorageValue` storing a `u32` into a more complex structure that tracks both current and previous values using the `CurrentAndPreviousValue` struct:\n\n- Old `StorageValue` format:\n\n ```rust\n #[pallet::storage]\n pub type Value = StorageValue<_, u32>;\n ```\n\n- New `StorageValue` format:\n\n ```rust\n /// Example struct holding the most recently set [`u32`] and the\n /// second most recently set [`u32`] (if one existed).\n #[docify::export]\n #[derive(\n \tClone, Eq, PartialEq, Encode, Decode, RuntimeDebug, scale_info::TypeInfo, MaxEncodedLen,\n )]\n pub struct CurrentAndPreviousValue {\n \t/// The most recently set value.\n \tpub current: u32,\n \t/// The previous value, if one existed.\n \tpub previous: Option,\n }\n #[pallet::storage]\n \tpub type Value = StorageValue<_, CurrentAndPreviousValue>;\n ```\n\n- Migration:\n\n ```rust\n use frame_support::{\n \tstorage_alias,\n \ttraits::{Get, UncheckedOnRuntimeUpgrade},\n };\n\n #[cfg(feature = \"try-runtime\")]\n use alloc::vec::Vec;\n\n /// Collection of storage item formats from the previous storage version.\n ///\n /// Required so we can read values in the v0 storage format during the migration.\n mod v0 {\n \tuse super::*;\n\n \t/// V0 type for [`crate::Value`].\n \t#[storage_alias]\n \tpub type Value = StorageValue, u32>;\n }\n\n /// Implements [`UncheckedOnRuntimeUpgrade`], migrating the state of this pallet from V0 to V1.\n ///\n /// In V0 of the template [`crate::Value`] is just a `u32`. 
In V1, it has been upgraded to\n /// contain the struct [`crate::CurrentAndPreviousValue`].\n ///\n /// In this migration, update the on-chain storage for the pallet to reflect the new storage\n /// layout.\n pub struct InnerMigrateV0ToV1(core::marker::PhantomData);\n\n impl UncheckedOnRuntimeUpgrade for InnerMigrateV0ToV1 {\n \t/// Return the existing [`crate::Value`] so we can check that it was correctly set in\n \t/// `InnerMigrateV0ToV1::post_upgrade`.\n \t#[cfg(feature = \"try-runtime\")]\n \tfn pre_upgrade() -> Result, sp_runtime::TryRuntimeError> {\n \t\tuse codec::Encode;\n\n \t\t// Access the old value using the `storage_alias` type\n \t\tlet old_value = v0::Value::::get();\n \t\t// Return it as an encoded `Vec`\n \t\tOk(old_value.encode())\n \t}\n\n \t/// Migrate the storage from V0 to V1.\n \t///\n \t/// - If the value doesn't exist, there is nothing to do.\n \t/// - If the value exists, it is read and then written back to storage inside a\n \t/// [`crate::CurrentAndPreviousValue`].\n \tfn on_runtime_upgrade() -> frame_support::weights::Weight {\n \t\t// Read the old value from storage\n \t\tif let Some(old_value) = v0::Value::::take() {\n \t\t\t// Write the new value to storage\n \t\t\tlet new = crate::CurrentAndPreviousValue { current: old_value, previous: None };\n \t\t\tcrate::Value::::put(new);\n \t\t\t// One read + write for taking the old value, and one write for setting the new value\n \t\t\tT::DbWeight::get().reads_writes(1, 2)\n \t\t} else {\n \t\t\t// No writes since there was no old value, just one read for checking\n \t\t\tT::DbWeight::get().reads(1)\n \t\t}\n \t}\n\n \t/// Verifies the storage was migrated correctly.\n \t///\n \t/// - If there was no old value, the new value should not be set.\n \t/// - If there was an old value, the new value should be a [`crate::CurrentAndPreviousValue`].\n \t#[cfg(feature = \"try-runtime\")]\n \tfn post_upgrade(state: Vec) -> Result<(), sp_runtime::TryRuntimeError> {\n \t\tuse codec::Decode;\n \t\tuse frame_support::ensure;\n\n \t\tlet maybe_old_value = Option::::decode(&mut &state[..]).map_err(|_| {\n \t\t\tsp_runtime::TryRuntimeError::Other(\"Failed to decode old value from storage\")\n \t\t})?;\n\n \t\tmatch maybe_old_value {\n \t\t\tSome(old_value) => {\n \t\t\t\tlet expected_new_value =\n \t\t\t\t\tcrate::CurrentAndPreviousValue { current: old_value, previous: None };\n \t\t\t\tlet actual_new_value = crate::Value::::get();\n\n \t\t\t\tensure!(actual_new_value.is_some(), \"New value not set\");\n \t\t\t\tensure!(\n \t\t\t\t\tactual_new_value == Some(expected_new_value),\n \t\t\t\t\t\"New value not set correctly\"\n \t\t\t\t);\n \t\t\t},\n \t\t\tNone => {\n \t\t\t\tensure!(crate::Value::::get().is_none(), \"New value unexpectedly set\");\n \t\t\t},\n \t\t};\n \t\tOk(())\n \t}\n }\n\n /// [`UncheckedOnRuntimeUpgrade`] implementation [`InnerMigrateV0ToV1`] wrapped in a\n /// [`VersionedMigration`](frame_support::migrations::VersionedMigration), which ensures that:\n /// - The migration only runs once when the on-chain storage version is 0\n /// - The on-chain storage version is updated to `1` after the migration executes\n /// - Reads/Writes from checking/settings the on-chain storage version are accounted for\n pub type MigrateV0ToV1 = frame_support::migrations::VersionedMigration<\n \t0, // The migration will only execute when the on-chain storage version is 0\n \t1, // The on-chain storage version will be set to 1 after the migration is complete\n \tInnerMigrateV0ToV1,\n \tcrate::pallet::Pallet,\n \t::DbWeight,\n >;\n ```"} 
+{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 6, "depth": 3, "title": "Migration Organization", "anchor": "migration-organization", "start_char": 14864, "end_char": 15556, "estimated_token_count": 148, "token_estimator": "heuristic-v1", "text": "### Migration Organization\n\nBest practices recommend organizing migrations in a separate module within your pallet. Here's the recommended file structure:\n\n```plain\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── migrations/ # All migration-related code\n│ ├── mod.rs # Migrations module definition\n│ ├── v1.rs # V0 -> V1 migration\n│ └── v2.rs # V1 -> V2 migration\n└── Cargo.toml\n```\n\nThis structure provides several benefits:\n\n- Separates migration logic from core pallet functionality.\n- Makes migrations easier to test and maintain.\n- Provides explicit versioning of storage changes.\n- Simplifies the addition of future migrations."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 7, "depth": 3, "title": "Scheduling Migrations", "anchor": "scheduling-migrations", "start_char": 15556, "end_char": 16131, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "### Scheduling Migrations\n\nTo execute migrations during a runtime upgrade, you must configure them in your runtime's Executive pallet. Add your migrations in `runtime/src/lib.rs`:\n\n```rust\n/// Tuple of migrations (structs that implement `OnRuntimeUpgrade`)\ntype Migrations = (\n pallet_my_pallet::migrations::v1::Migration,\n // More migrations can be added here\n);\npub type Executive = frame_executive::Executive<\n Runtime,\n Block,\n frame_system::ChainContext,\n Runtime,\n AllPalletsWithSystem,\n Migrations, // Include migrations here\n>;\n\n```"} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 8, "depth": 2, "title": "Single-Block Migrations", "anchor": "single-block-migrations", "start_char": 16131, "end_char": 17219, "estimated_token_count": 196, "token_estimator": "heuristic-v1", "text": "## Single-Block Migrations\n\nSingle-block migrations execute their logic within one block immediately following a runtime upgrade. They run as part of the runtime upgrade process through the `OnRuntimeUpgrade` trait implementation and must be completed before any other runtime logic executes.\n\nWhile single-block migrations are straightforward to implement and provide immediate data transformation, they carry significant risks. The most critical consideration is that they must complete within one block's weight limits. 
This is especially crucial for parachains, where exceeding block weight limits will brick the chain.\n\nUse single-block migrations only when you can guarantee:\n\n- The migration has a bounded execution time.\n- Weight calculations are thoroughly tested.\n- Total weight will never exceed block limits.\n\nFor a complete implementation example of a single-block migration, refer to the [single-block migration example]( https://paritytech.github.io/polkadot-sdk/master/pallet_example_single_block_migrations/index.html){target=\\_blank} in the Polkadot SDK documentation."} +{"page_id": "develop-parachains-maintenance-storage-migrations", "page_title": "Storage Migrations", "index": 9, "depth": 2, "title": "Multi Block Migrations", "anchor": "multi-block-migrations", "start_char": 17219, "end_char": 18500, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Multi Block Migrations\n\nMulti-block migrations distribute the migration workload across multiple blocks, providing a safer approach for production environments. The migration state is tracked in storage, allowing the process to pause and resume across blocks.\n\nThis approach is essential for production networks and parachains as the risk of exceeding block weight limits is eliminated. Multi-block migrations can safely handle large storage collections, unbounded data structures, and complex nested data types where weight consumption might be unpredictable.\n\nMulti-block migrations are ideal when dealing with:\n\n- Large-scale storage migrations.\n- Unbounded storage items or collections.\n- Complex data structures with uncertain weight costs.\n\nThe primary trade-off is increased implementation complexity, as you must manage the migration state and handle partial completion scenarios. However, multi-block migrations' significant safety benefits and operational reliability are typically worth the increased complexity.\n\nFor a complete implementation example of multi-block migrations, refer to the [official example](https://github.com/paritytech/polkadot-sdk/tree/polkadot-stable2506-2/substrate/frame/examples/multi-block-migrations){target=\\_blank} in the Polkadot SDK."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 22, "end_char": 1071, "estimated_token_count": 182, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nParachain locks are a critical security mechanism in the Polkadot ecosystem designed to maintain decentralization during the parachain lifecycle. 
These locks prevent potential centralization risks that could emerge during the early stages of parachain operation.\n\nThe locking system follows strict, well-defined conditions that distribute control across multiple authorities:\n\n- Relay chain governance has the authority to lock any parachain.\n- A parachain can lock its own lock.\n- Parachain managers have permission to lock the parachain.\n- Parachains are locked automatically when they successfully produce their first block.\n\nSimilarly, unlocking a parachain follows controlled procedures:\n\n- Relay chain governance retains the authority to unlock any parachain.\n- A parachain can unlock its own lock.\n\nThis document guides you through checking a parachain's lock status and safely executing the unlock procedure from a parachain using [XCM (Cross-Consensus Messaging)](/develop/interoperability/intro-to-xcm/){target=\\_blank}."} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 1, "depth": 2, "title": "Check If the Parachain Is Locked", "anchor": "check-if-the-parachain-is-locked", "start_char": 1071, "end_char": 2100, "estimated_token_count": 262, "token_estimator": "heuristic-v1", "text": "## Check If the Parachain Is Locked\n\nBefore unlocking a parachain, you should verify its current lock status. This can be done through the Polkadot.js interface:\n\n1. In [Polkadot.js Apps](https://polkadot.js.org/apps/#/explorer){target=\\_blank}, connect to the relay chain, navigate to the **Developer** dropdown and select the **Chain State** option.\n\n2. Query the parachain locked status:\n 1. Select **`registrar`**.\n 2. Choose the **`paras`** option.\n 3. Input the parachain ID you want to check as a parameter (e.g. `2006`).\n 4. Click the **+** button to execute the query.\n 5. Check the status of the parachain lock.\n - **`manager`**: The account that has placed a deposit for registering this parachain.\n - **`deposit`**: The amount reserved by the `manager` account for the registration.\n - **`locked`**: Whether the parachain registration should be locked from being controlled by the manager.\n\n ![](/images/develop/parachains/maintenance/unlock-parachain/unlock-parachain-1.webp)"} {"page_id": "develop-parachains-maintenance-unlock-parachain", "page_title": "Unlock a Parachain", "index": 2, "depth": 2, "title": "How to Unlock a Parachain", "anchor": "how-to-unlock-a-parachain", "start_char": 2100, "end_char": 2755, "estimated_token_count": 121, "token_estimator": "heuristic-v1", "text": "## How to Unlock a Parachain\n\nUnlocking a parachain requires sending an XCM (Cross-Consensus Message) to the relay chain from the parachain itself, sending a message with Root origin, or this can be accomplished through the relay chain's governance mechanism, executing a root call.\n\nIf sending an XCM, the parachain origin must have proper authorization, typically from either the parachain's sudo pallet (if enabled) or its governance system.\n\nThis guide demonstrates the unlocking process using a parachain with the sudo pallet. 
For parachains using governance-based authorization instead, the process will require adjustments to how the XCM is sent."} @@ -270,10 +270,10 @@ {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 2, "depth": 3, "title": "Benchmarking and Weight", "anchor": "benchmarking-and-weight", "start_char": 2015, "end_char": 3681, "estimated_token_count": 321, "token_estimator": "heuristic-v1", "text": "### Benchmarking and Weight \n\nIn Polkadot SDK-based chains, weight quantifies the computational effort needed to process transactions. This weight includes factors such as:\n\n- Computational complexity.\n- Storage complexity (proof size).\n- Database reads and writes.\n- Hardware specifications.\n\nBenchmarking uses real-world testing to simulate worst-case scenarios for extrinsics. The framework generates a linear model for weight calculation by running multiple iterations with varied parameters. These worst-case weights ensure blocks remain within execution limits, enabling the runtime to maintain throughput under varying loads. Excess fees can be refunded if a call uses fewer resources than expected, offering users a fair cost model.\n \nBecause weight is a generic unit of measurement based on computation time for a specific physical machine, the weight of any function can change based on the specifications of hardware used for benchmarking. By modeling the expected weight of each runtime function, the blockchain can calculate the number of transactions or system-level calls it can execute within a certain period.\n\nWithin FRAME, each function call that is dispatched must have a `#[pallet::weight]` annotation that can return the expected weight for the worst-case scenario execution of that function given its inputs:\n\n```rust hl_lines=\"2\"\n#[pallet::call_index(0)]\n#[pallet::weight(T::WeightInfo::do_something())]\npub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n```\n\nThe `WeightInfo` file is automatically generated during benchmarking. Based on these tests, this file provides accurate weights for each extrinsic."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 3, "depth": 2, "title": "Benchmarking Process", "anchor": "benchmarking-process", "start_char": 3681, "end_char": 4224, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Benchmarking Process\n\nBenchmarking a pallet involves the following steps: \n\n1. Creating a `benchmarking.rs` file within your pallet's structure.\n2. Writing a benchmarking test for each extrinsic.\n3. Executing the benchmarking tool to calculate weights based on performance metrics.\n\nThe benchmarking tool runs multiple iterations to model worst-case execution times and determine the appropriate weight. By default, the benchmarking pipeline is deactivated. 
To activate it, compile your runtime with the `runtime-benchmarks` feature flag."} {"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 4, "depth": 3, "title": "Prepare Your Environment", "anchor": "prepare-your-environment", "start_char": 4224, "end_char": 5278, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "### Prepare Your Environment\n\nInstall the [`frame-omni-bencher`](https://crates.io/crates/frame-omni-bencher){target=\\_blank} command-line tool:\n\n```bash\ncargo install frame-omni-bencher\n```\n\nBefore writing benchmark tests, you need to ensure the `frame-benchmarking` crate is included in your pallet's `Cargo.toml` similar to the following:\n\n```toml title=\"Cargo.toml\"\nframe-benchmarking = { version = \"37.0.0\", default-features = false }\n```\n\nYou must also ensure that you add the `runtime-benchmarks` feature flag as follows under the `[features]` section of your pallet's `Cargo.toml`:\n\n```toml title=\"Cargo.toml\"\nruntime-benchmarks = [\n \"frame-benchmarking/runtime-benchmarks\",\n \"frame-support/runtime-benchmarks\",\n \"frame-system/runtime-benchmarks\",\n \"sp-runtime/runtime-benchmarks\",\n]\n```\n\nLastly, ensure that `frame-benchmarking` is included in `std = []`: \n\n```toml title=\"Cargo.toml\"\nstd = [\n # ...\n \"frame-benchmarking?/std\",\n # ...\n]\n```\n\nOnce complete, you have the required dependencies for writing benchmark tests for your pallet."} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 5, "depth": 3, "title": "Write Benchmark Tests", "anchor": "write-benchmark-tests", "start_char": 5278, "end_char": 6838, "estimated_token_count": 377, "token_estimator": "heuristic-v1", "text": "### Write Benchmark Tests\n\nCreate a `benchmarking.rs` file in your pallet's `src/`. Your directory structure should look similar to the following:\n\n```\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── benchmarking.rs # Benchmarking\n└── Cargo.toml\n```\n\nWith the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\\_blank} to get started as follows:\n\n```rust title=\"benchmarking.rs (starter template)\"\n\n```\n\nIn your benchmarking tests, employ these best practices:\n\n- **Write custom testing functions**: The function `do_something` in the preceding example is a placeholder. Similar to writing unit tests, you must write custom functions to benchmark test your extrinsics. Access the mock runtime and use functions such as `whitelisted_caller()` to sign transactions and facilitate testing.\n- **Use the `#[extrinsic_call]` macro**: This macro is used when calling the extrinsic itself and is a required part of a benchmarking function. See the [`extrinsic_call`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html#extrinsic_call-and-block){target=\\_blank} docs for more details.\n- **Validate extrinsic behavior**: The `assert_eq` expression ensures that the extrinsic is working properly within the benchmark context.\n\nAdd the `benchmarking` module to your pallet. 
In the pallet `lib.rs` file add the following:\n\n```rust\n#[cfg(feature = \"runtime-benchmarks\")]\nmod benchmarking;\n```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 6, "depth": 3, "title": "Add Benchmarks to Runtime", "anchor": "add-benchmarks-to-runtime", "start_char": 6838, "end_char": 8967, "estimated_token_count": 418, "token_estimator": "heuristic-v1", "text": "### Add Benchmarks to Runtime\n\nBefore running the benchmarking tool, you must integrate benchmarks with your runtime as follows:\n\n1. Navigate to your `runtime/src` directory and check if a `benchmarks.rs` file exists. If not, create one. This file will contain the macro that registers all pallets for benchmarking along with their respective configurations:\n\n ```rust title=\"benchmarks.rs\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n [pallet_balances, Balances]\n [pallet_session, SessionBench::]\n [pallet_timestamp, Timestamp]\n [pallet_message_queue, MessageQueue]\n [pallet_sudo, Sudo]\n [pallet_collator_selection, CollatorSelection]\n [cumulus_pallet_parachain_system, ParachainSystem]\n [cumulus_pallet_xcmp_queue, XcmpQueue]\n );\n ```\n\n For example, to add a new pallet named `pallet_parachain_template` for benchmarking, include it in the macro as shown:\n ```rust title=\"benchmarks.rs\" hl_lines=\"3\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n );\n ```\n\n !!!warning \"Updating `define_benchmarks!` macro is required\"\n Any pallet that needs to be benchmarked must be included in the [`define_benchmarks!`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/macro.define_benchmarks.html){target=\\_blank} macro. The CLI will only be able to access and benchmark pallets that are registered here.\n\n2. Check your runtime's `lib.rs` file to ensure the `benchmarks` module is imported. The import should look like this:\n\n ```rust title=\"lib.rs\"\n #[cfg(feature = \"runtime-benchmarks\")]\n mod benchmarks;\n ```\n\n The `runtime-benchmarks` feature gate ensures benchmark tests are isolated from production runtime code.\n\n3. Enable runtime benchmarking for your pallet in `runtime/Cargo.toml`:\n\n ```toml\n runtime-benchmarks = [\n # ...\n \"pallet_parachain_template/runtime-benchmarks\",\n ]\n\n ```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 7, "depth": 3, "title": "Run Benchmarks", "anchor": "run-benchmarks", "start_char": 8967, "end_char": 13352, "estimated_token_count": 1100, "token_estimator": "heuristic-v1", "text": "### Run Benchmarks\n\nYou can now compile your runtime with the `runtime-benchmarks` feature flag. This feature flag is crucial as the benchmarking tool will look for this feature being enabled to know when it should run benchmark tests. Follow these steps to compile the runtime with benchmarking enabled:\n\n1. Run `build` with the feature flag included:\n\n ```bash\n cargo build --features runtime-benchmarks --release\n ```\n\n2. Create a `weights.rs` file in your pallet's `src/` directory. This file will store the auto-generated weight calculations:\n\n ```bash\n touch weights.rs\n ```\n\n3. Before running the benchmarking tool, you'll need a template file that defines how weight information should be formatted. 
Download the official template from the Polkadot SDK repository and save it in your project folders for future use:\n\n ```bash\n curl https://raw.githubusercontent.com/paritytech/polkadot-sdk/refs/tags/polkadot-stable2412/substrate/.maintain/frame-weight-template.hbs \\\n --output ./pallets/benchmarking/frame-weight-template.hbs\n ```\n\n4. Run the benchmarking tool to measure extrinsic weights:\n\n ```bash\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet INSERT_NAME_OF_PALLET \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output weights.rs\n ```\n\n !!! tip \"Flag definitions\"\n - **`--runtime`**: The path to your runtime's Wasm.\n - **`--pallet`**: The name of the pallet you wish to benchmark. This pallet must be configured in your runtime and defined in `define_benchmarks`.\n - **`--extrinsic`**: Which extrinsic to test. Using `\"\"` implies all extrinsics will be benchmarked.\n - **`--template`**: Defines how weight information should be formatted.\n - **`--output`**: Where the output of the auto-generated weights will reside.\n\nThe generated `weights.rs` file contains weight annotations for your extrinsics, ready to be added to your pallet. The output should be similar to the following. Some output is omitted for brevity:\n\n
\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet \"INSERT_NAME_OF_PALLET\" \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output ./weights.rs\n ...\n 2025-01-15T16:41:33.557045Z INFO polkadot_sdk_frame::benchmark::pallet: [ 0 % ] Starting benchmark: pallet_parachain_template::do_something\n 2025-01-15T16:41:33.564644Z INFO polkadot_sdk_frame::benchmark::pallet: [ 50 % ] Starting benchmark: pallet_parachain_template::cause_error\n ...\n Created file: \"weights.rs\"\n \n
\n\n#### Add Benchmark Weights to Pallet\n\nOnce the `weights.rs` is generated, you must integrate it with your pallet. \n\n1. To begin the integration, import the `weights` module and the `WeightInfo` trait, then add both to your pallet's `Config` trait. Complete the following steps to set up the configuration:\n\n ```rust title=\"lib.rs\"\n pub mod weights;\n use crate::weights::WeightInfo;\n\n /// Configure the pallet by specifying the parameters and types on which it depends.\n #[pallet::config]\n pub trait Config: frame_system::Config {\n // ...\n /// A type representing the weights required by the dispatchables of this pallet.\n type WeightInfo: WeightInfo;\n }\n ```\n\n2. Next, you must add this to the `#[pallet::weight]` annotation in all the extrinsics via the `Config` as follows:\n\n ```rust hl_lines=\"2\" title=\"lib.rs\"\n #[pallet::call_index(0)]\n #[pallet::weight(T::WeightInfo::do_something())]\n pub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n ```\n\n3. Finally, configure the actual weight values in your runtime. In `runtime/src/config/mod.rs`, add the following code:\n\n ```rust title=\"mod.rs\"\n // Configure pallet.\n impl pallet_parachain_template::Config for Runtime {\n // ...\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n ```"} -{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 13352, "end_char": 13835, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- View the Rust Docs for a more comprehensive, low-level view of the [FRAME V2 Benchmarking Suite](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=_blank}.\n- Read the [FRAME Benchmarking and Weights](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/frame_benchmarking_weight/index.html){target=_blank} reference document, a concise guide which details how weights and benchmarking work."} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 5, "depth": 3, "title": "Write Benchmark Tests", "anchor": "write-benchmark-tests", "start_char": 5278, "end_char": 7734, "estimated_token_count": 645, "token_estimator": "heuristic-v1", "text": "### Write Benchmark Tests\n\nCreate a `benchmarking.rs` file in your pallet's `src/`. Your directory structure should look similar to the following:\n\n```\nmy-pallet/\n├── src/\n│ ├── lib.rs # Main pallet implementation\n│ └── benchmarking.rs # Benchmarking\n└── Cargo.toml\n```\n\nWith the directory structure set, you can use the [`polkadot-sdk-parachain-template`](https://github.com/paritytech/polkadot-sdk-parachain-template/tree/master/pallets){target=\\_blank} to get started as follows:\n\n```rust title=\"benchmarking.rs (starter template)\"\n//! 
Benchmarking setup for pallet-template\n#![cfg(feature = \"runtime-benchmarks\")]\n\nuse super::*;\nuse frame_benchmarking::v2::*;\n\n#[benchmarks]\nmod benchmarks {\n\tuse super::*;\n\t#[cfg(test)]\n\tuse crate::pallet::Pallet as Template;\n\tuse frame_system::RawOrigin;\n\n\t#[benchmark]\n\tfn do_something() {\n\t\tlet caller: T::AccountId = whitelisted_caller();\n\t\t#[extrinsic_call]\n\t\tdo_something(RawOrigin::Signed(caller), 100);\n\n\t\tassert_eq!(Something::::get().map(|v| v.block_number), Some(100u32.into()));\n\t}\n\n\t#[benchmark]\n\tfn cause_error() {\n\t\tSomething::::put(CompositeStruct { block_number: 100u32.into() });\n\t\tlet caller: T::AccountId = whitelisted_caller();\n\t\t#[extrinsic_call]\n\t\tcause_error(RawOrigin::Signed(caller));\n\n\t\tassert_eq!(Something::::get().map(|v| v.block_number), Some(101u32.into()));\n\t}\n\n\timpl_benchmark_test_suite!(Template, crate::mock::new_test_ext(), crate::mock::Test);\n}\n```\n\nIn your benchmarking tests, employ these best practices:\n\n- **Write custom testing functions**: The function `do_something` in the preceding example is a placeholder. Similar to writing unit tests, you must write custom functions to benchmark test your extrinsics. Access the mock runtime and use functions such as `whitelisted_caller()` to sign transactions and facilitate testing.\n- **Use the `#[extrinsic_call]` macro**: This macro is used when calling the extrinsic itself and is a required part of a benchmarking function. See the [`extrinsic_call`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html#extrinsic_call-and-block){target=\\_blank} docs for more details.\n- **Validate extrinsic behavior**: The `assert_eq` expression ensures that the extrinsic is working properly within the benchmark context.\n\nAdd the `benchmarking` module to your pallet. In the pallet `lib.rs` file add the following:\n\n```rust\n#[cfg(feature = \"runtime-benchmarks\")]\nmod benchmarking;\n```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 6, "depth": 3, "title": "Add Benchmarks to Runtime", "anchor": "add-benchmarks-to-runtime", "start_char": 7734, "end_char": 9863, "estimated_token_count": 418, "token_estimator": "heuristic-v1", "text": "### Add Benchmarks to Runtime\n\nBefore running the benchmarking tool, you must integrate benchmarks with your runtime as follows:\n\n1. Navigate to your `runtime/src` directory and check if a `benchmarks.rs` file exists. If not, create one. 
This file will contain the macro that registers all pallets for benchmarking along with their respective configurations:\n\n ```rust title=\"benchmarks.rs\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n [pallet_balances, Balances]\n [pallet_session, SessionBench::]\n [pallet_timestamp, Timestamp]\n [pallet_message_queue, MessageQueue]\n [pallet_sudo, Sudo]\n [pallet_collator_selection, CollatorSelection]\n [cumulus_pallet_parachain_system, ParachainSystem]\n [cumulus_pallet_xcmp_queue, XcmpQueue]\n );\n ```\n\n For example, to add a new pallet named `pallet_parachain_template` for benchmarking, include it in the macro as shown:\n ```rust title=\"benchmarks.rs\" hl_lines=\"3\"\n frame_benchmarking::define_benchmarks!(\n [frame_system, SystemBench::]\n [pallet_parachain_template, TemplatePallet]\n );\n ```\n\n !!!warning \"Updating `define_benchmarks!` macro is required\"\n Any pallet that needs to be benchmarked must be included in the [`define_benchmarks!`](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/macro.define_benchmarks.html){target=\\_blank} macro. The CLI will only be able to access and benchmark pallets that are registered here.\n\n2. Check your runtime's `lib.rs` file to ensure the `benchmarks` module is imported. The import should look like this:\n\n ```rust title=\"lib.rs\"\n #[cfg(feature = \"runtime-benchmarks\")]\n mod benchmarks;\n ```\n\n The `runtime-benchmarks` feature gate ensures benchmark tests are isolated from production runtime code.\n\n3. Enable runtime benchmarking for your pallet in `runtime/Cargo.toml`:\n\n ```toml\n runtime-benchmarks = [\n # ...\n \"pallet_parachain_template/runtime-benchmarks\",\n ]\n\n ```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 7, "depth": 3, "title": "Run Benchmarks", "anchor": "run-benchmarks", "start_char": 9863, "end_char": 14248, "estimated_token_count": 1100, "token_estimator": "heuristic-v1", "text": "### Run Benchmarks\n\nYou can now compile your runtime with the `runtime-benchmarks` feature flag. This feature flag is crucial as the benchmarking tool will look for this feature being enabled to know when it should run benchmark tests. Follow these steps to compile the runtime with benchmarking enabled:\n\n1. Run `build` with the feature flag included:\n\n ```bash\n cargo build --features runtime-benchmarks --release\n ```\n\n2. Create a `weights.rs` file in your pallet's `src/` directory. This file will store the auto-generated weight calculations:\n\n ```bash\n touch weights.rs\n ```\n\n3. Before running the benchmarking tool, you'll need a template file that defines how weight information should be formatted. Download the official template from the Polkadot SDK repository and save it in your project folders for future use:\n\n ```bash\n curl https://raw.githubusercontent.com/paritytech/polkadot-sdk/refs/tags/polkadot-stable2412/substrate/.maintain/frame-weight-template.hbs \\\n --output ./pallets/benchmarking/frame-weight-template.hbs\n ```\n\n4. Run the benchmarking tool to measure extrinsic weights:\n\n ```bash\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet INSERT_NAME_OF_PALLET \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output weights.rs\n ```\n\n !!! 
tip \"Flag definitions\"\n - **`--runtime`**: The path to your runtime's Wasm.\n - **`--pallet`**: The name of the pallet you wish to benchmark. This pallet must be configured in your runtime and defined in `define_benchmarks`.\n - **`--extrinsic`**: Which extrinsic to test. Using `\"\"` implies all extrinsics will be benchmarked.\n - **`--template`**: Defines how weight information should be formatted.\n - **`--output`**: Where the output of the auto-generated weights will reside.\n\nThe generated `weights.rs` file contains weight annotations for your extrinsics, ready to be added to your pallet. The output should be similar to the following. Some output is omitted for brevity:\n\n
\n frame-omni-bencher v1 benchmark pallet \\\n --runtime INSERT_PATH_TO_WASM_RUNTIME \\\n --pallet \"INSERT_NAME_OF_PALLET\" \\\n --extrinsic \"\" \\\n --template ./frame-weight-template.hbs \\\n --output ./weights.rs\n ...\n 2025-01-15T16:41:33.557045Z INFO polkadot_sdk_frame::benchmark::pallet: [ 0 % ] Starting benchmark: pallet_parachain_template::do_something\n 2025-01-15T16:41:33.564644Z INFO polkadot_sdk_frame::benchmark::pallet: [ 50 % ] Starting benchmark: pallet_parachain_template::cause_error\n ...\n Created file: \"weights.rs\"\n \n
\n\n#### Add Benchmark Weights to Pallet\n\nOnce the `weights.rs` is generated, you must integrate it with your pallet. \n\n1. To begin the integration, import the `weights` module and the `WeightInfo` trait, then add both to your pallet's `Config` trait. Complete the following steps to set up the configuration:\n\n ```rust title=\"lib.rs\"\n pub mod weights;\n use crate::weights::WeightInfo;\n\n /// Configure the pallet by specifying the parameters and types on which it depends.\n #[pallet::config]\n pub trait Config: frame_system::Config {\n // ...\n /// A type representing the weights required by the dispatchables of this pallet.\n type WeightInfo: WeightInfo;\n }\n ```\n\n2. Next, you must add this to the `#[pallet::weight]` annotation in all the extrinsics via the `Config` as follows:\n\n ```rust hl_lines=\"2\" title=\"lib.rs\"\n #[pallet::call_index(0)]\n #[pallet::weight(T::WeightInfo::do_something())]\n pub fn do_something(origin: OriginFor) -> DispatchResultWithPostInfo { Ok(()) }\n ```\n\n3. Finally, configure the actual weight values in your runtime. In `runtime/src/config/mod.rs`, add the following code:\n\n ```rust title=\"mod.rs\"\n // Configure pallet.\n impl pallet_parachain_template::Config for Runtime {\n // ...\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n ```"} +{"page_id": "develop-parachains-testing-benchmarking", "page_title": "Benchmarking FRAME Pallets", "index": 8, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 14248, "end_char": 14731, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n- View the Rust Docs for a more comprehensive, low-level view of the [FRAME V2 Benchmarking Suite](https://paritytech.github.io/polkadot-sdk/master/frame_benchmarking/v2/index.html){target=_blank}.\n- Read the [FRAME Benchmarking and Weights](https://paritytech.github.io/polkadot-sdk/master/polkadot_sdk_docs/reference_docs/frame_benchmarking_weight/index.html){target=_blank} reference document, a concise guide which details how weights and benchmarking work."} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 16, "end_char": 474, "estimated_token_count": 78, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nTesting is essential in Polkadot SDK development to ensure your blockchain operates as intended and effectively handles various potential scenarios. 
This guide walks you through setting up an environment to test pallets within the [runtime](/polkadot-protocol/glossary#runtime){target=_blank}, allowing you to evaluate how different pallets, their configurations, and system components interact to ensure reliable blockchain functionality."} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 1, "depth": 2, "title": "Configuring a Mock Runtime", "anchor": "configuring-a-mock-runtime", "start_char": 474, "end_char": 505, "estimated_token_count": 6, "token_estimator": "heuristic-v1", "text": "## Configuring a Mock Runtime"} {"page_id": "develop-parachains-testing-mock-runtime", "page_title": "Mock Runtime for Pallet Testing", "index": 2, "depth": 3, "title": "Testing Module", "anchor": "testing-module", "start_char": 505, "end_char": 2264, "estimated_token_count": 348, "token_estimator": "heuristic-v1", "text": "### Testing Module\n\nThe mock runtime includes all the necessary pallets and configurations needed for testing. To ensure proper testing, you must create a module that integrates all components, enabling assessment of interactions between pallets and system elements.\n\nHere's a simple example of how to create a testing module that simulates these interactions:\n\n```rust\npub mod tests {\n use crate::*;\n // ...\n}\n```\n\nThe `crate::*;` snippet imports all the components from your crate (including runtime configurations, pallet modules, and utility functions) into the `tests` module. This allows you to write tests without manually importing each piece, making the code more concise and readable. You can opt to instead create a separate `mock.rs` file to define the configuration for your mock runtime and a companion `tests.rs` file to house the specific logic for each test.\n\nOnce the testing module is configured, you can craft your mock runtime using the [`frame_support::runtime`](https://paritytech.github.io/polkadot-sdk/master/frame_support/attr.runtime.html){target=\\_blank} macro. This macro allows you to define a runtime environment that will be created for testing purposes:\n\n```rust\npub mod tests {\n use crate::*;\n\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n RuntimeCall,\n RuntimeEvent,\n RuntimeError,\n RuntimeOrigin,\n RuntimeFreezeReason,\n RuntimeHoldReason,\n RuntimeSlashReason,\n RuntimeLockId,\n RuntimeTask\n )]\n pub struct Test;\n\n #[runtime::pallet_index(0)]\n pub type System = frame_system::Pallet;\n\n // Other pallets...\n }\n}\n```"} @@ -465,15 +465,15 @@ {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 199, "end_char": 721, "estimated_token_count": 78, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nPrecompiles offer Polkadot Hub developers access to high-performance native functions directly from their smart contracts. Each precompile has a specific address and accepts a particular input data format. 
When called correctly, they execute optimized, native implementations of commonly used functions much more efficiently than equivalent contract-based implementations.\n\nThis guide demonstrates how to interact with each standard precompile available in Polkadot Hub through Solidity smart contracts."} {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 1, "depth": 2, "title": "Basic Precompile Interaction Pattern", "anchor": "basic-precompile-interaction-pattern", "start_char": 721, "end_char": 1661, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## Basic Precompile Interaction Pattern\n\nAll precompiles follow a similar interaction pattern:\n\n```solidity\n// Generic pattern for calling precompiles\nfunction callPrecompile(address precompileAddress, bytes memory input)\n internal\n returns (bool success, bytes memory result)\n{\n // Direct low-level call to the precompile address\n (success, result) = precompileAddress.call(input);\n\n // Ensure the call was successful\n require(success, \"Precompile call failed\");\n\n return (success, result);\n}\n```\n\nFeel free to check the [`precompiles-hardhat`](https://github.com/polkadot-developers/polkavm-hardhat-examples/tree/v0.0.3/precompiles-hardhat){target=\\_blank} repository to check all the precompiles examples. The repository contains a set of example contracts and test files demonstrating how to interact with each precompile in Polkadot Hub.\n\nNow, you'll explore how to use each precompile available in Polkadot Hub."} {"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 2, "depth": 2, "title": "ECRecover (0x01)", "anchor": "ecrecover-0x01", "start_char": 1661, "end_char": 3161, "estimated_token_count": 325, "token_estimator": "heuristic-v1", "text": "## ECRecover (0x01)\n\nECRecover recovers an Ethereum address associated with the public key used to sign a message.\n\n```solidity title=\"ECRecover.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ECRecoverExample {\n event ECRecovered(bytes result);\n\n // Address of the ECRecover precompile\n address constant EC_RECOVER_ADDRESS = address(0x01);\n bytes public result;\n\n function callECRecover(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = EC_RECOVER_ADDRESS.call{value: 0}(input);\n\n if (success) {\n emit ECRecovered(resultInMemory);\n }\n\n result = resultInMemory;\n }\n\n function getRecoveredAddress() public view returns (address) {\n require(result.length == 32, \"Invalid result length\");\n return address(uint160(uint256(bytes32(result))));\n }\n}\n```\n\nTo interact with the ECRecover precompile, you can deploy the `ECRecoverExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment. The `callECRecover` function takes a 128-byte input combining the message `hash`, `v`, `r`, and `s` signature values. 
Check this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ECRecover.js){target=\\_blank} that shows how to format this input and verify that the recovered address matches the expected result."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 3, "depth": 2, "title": "SHA-256 (0x02)", "anchor": "sha-256-0x02", "start_char": 3161, "end_char": 3843, "estimated_token_count": 194, "token_estimator": "heuristic-v1", "text": "## SHA-256 (0x02)\n\nThe SHA-256 precompile computes the SHA-256 hash of the input data.\n\n```solidity title=\"SHA256.sol\"\n\n```\n\nTo use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\\_blank} module."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 4, "depth": 2, "title": "RIPEMD-160 (0x03)", "anchor": "ripemd-160-0x03", "start_char": 3843, "end_char": 5232, "estimated_token_count": 299, "token_estimator": "heuristic-v1", "text": "## RIPEMD-160 (0x03)\n\nThe RIPEMD-160 precompile computes the RIPEMD-160 hash of the input data.\n\n```solidity title=\"RIPEMD160.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract RIPEMD160Example {\n // RIPEMD-160 precompile address\n address constant RIPEMD160_PRECOMPILE = address(0x03);\n\n bytes32 public result;\n\n event RIPEMD160Called(bytes32 result);\n\n function calculateRIPEMD160(bytes calldata input) public returns (bytes32) {\n (bool success, bytes memory returnData) = RIPEMD160_PRECOMPILE.call(\n input\n );\n require(success, \"RIPEMD-160 precompile call failed\");\n // return full 32 bytes, no assembly extraction\n bytes32 fullHash;\n assembly {\n fullHash := mload(add(returnData, 32))\n }\n result = fullHash;\n emit RIPEMD160Called(fullHash);\n return fullHash;\n }\n}\n```\n\nTo use it, you can deploy the `RIPEMD160Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `calculateRIPEMD160` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/RIPEMD160.js){target=\\_blank} shows how to hash a UTF-8 string, pad the 20-byte result to 32 bytes, and verify it against the expected output."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 5, "depth": 2, "title": "Identity (Data Copy) (0x04)", "anchor": "identity-data-copy-0x04", "start_char": 5232, "end_char": 6468, "estimated_token_count": 259, "token_estimator": "heuristic-v1", "text": "## Identity (Data Copy) (0x04)\n\nThe Identity precompile simply returns the input data as output. 
While seemingly trivial, it can be useful for testing and certain specialized scenarios.\n\n```solidity title=\"Identity.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract IdentityExample {\n event IdentityCalled(bytes result);\n\n // Address of the Identity precompile\n address constant IDENTITY_PRECOMPILE = address(0x04);\n\n bytes public result;\n\n function callIdentity(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = IDENTITY_PRECOMPILE.call(input);\n\n if (success) {\n emit IdentityCalled(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `IdentityExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callIdentity` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Identity.js){target=\\_blank} shows how to pass input data and verify that the precompile returns it unchanged."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 6, "depth": 2, "title": "Modular Exponentiation (0x05)", "anchor": "modular-exponentiation-0x05", "start_char": 6468, "end_char": 7950, "estimated_token_count": 309, "token_estimator": "heuristic-v1", "text": "## Modular Exponentiation (0x05)\n\nThe ModExp precompile performs modular exponentiation, which is an operation commonly needed in cryptographic algorithms.\n\n```solidity title=\"ModExp.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ModExpExample {\n address constant MODEXP_ADDRESS = address(0x05);\n\n function modularExponentiation(\n bytes memory base,\n bytes memory exponent,\n bytes memory modulus\n ) public view returns (bytes memory) {\n bytes memory input = abi.encodePacked(\n toBytes32(base.length),\n toBytes32(exponent.length),\n toBytes32(modulus.length),\n base,\n exponent,\n modulus\n );\n\n (bool success, bytes memory result) = MODEXP_ADDRESS.staticcall(input);\n require(success, \"ModExp precompile call failed\");\n\n return result;\n }\n\n function toBytes32(uint256 value) internal pure returns (bytes32) {\n return bytes32(value);\n }\n}\n```\n\nTo use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `modularExponentiation` with encoded `base`, `exponent`, and `modulus` bytes. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ModExp.js){target=\\_blank} shows how to test modular exponentiation like (4 ** 13) % 497 = 445."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 7, "depth": 2, "title": "BN128 Addition (0x06)", "anchor": "bn128-addition-0x06", "start_char": 7950, "end_char": 8599, "estimated_token_count": 157, "token_estimator": "heuristic-v1", "text": "## BN128 Addition (0x06)\n\nThe BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations.\n\n```solidity title=\"BN128Add.sol\"\n\n```\n\nTo use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\\_blank} demonstrates a valid curve addition and checks the result against known expected values."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 8, "depth": 2, "title": "BN128 Scalar Multiplication (0x07)", "anchor": "bn128-scalar-multiplication-0x07", "start_char": 8599, "end_char": 9214, "estimated_token_count": 149, "token_estimator": "heuristic-v1", "text": "## BN128 Scalar Multiplication (0x07)\n\nThe BN128Mul precompile performs scalar multiplication on the alt_bn128 curve.\n\n```solidity title=\"BN128Mul.sol\"\n\n```\n\nTo use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 9, "depth": 2, "title": "BN128 Pairing Check (0x08)", "anchor": "bn128-pairing-check-0x08", "start_char": 9214, "end_char": 9764, "estimated_token_count": 135, "token_estimator": "heuristic-v1", "text": "## BN128 Pairing Check (0x08)\n\nThe BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification.\n\n```solidity title=\"BN128Pairing.sol\"\n\n```\n\nYou can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or your preferred environment. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\\_blank} contains these tests with working examples."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 10, "depth": 2, "title": "Blake2F (0x09)", "anchor": "blake2f-0x09", "start_char": 9764, "end_char": 10518, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Blake2F (0x09)\n\nThe Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.\n\n```solidity title=\"Blake2F.sol\"\n\n```\n\nTo use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors."} -{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 11, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 10518, "end_char": 11136, "estimated_token_count": 92, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nPrecompiles in Polkadot Hub provide efficient, native implementations of cryptographic functions and other commonly used operations. By understanding how to interact with these precompiles from your Solidity contracts, you can build more efficient and feature-rich applications on the Polkadot ecosystem.\n\nThe examples provided in this guide demonstrate the basic patterns for interacting with each precompile. Developers can adapt these patterns to their specific use cases, leveraging the performance benefits of native implementations while maintaining the flexibility of smart contract development."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 3, "depth": 2, "title": "SHA-256 (0x02)", "anchor": "sha-256-0x02", "start_char": 3161, "end_char": 4399, "estimated_token_count": 294, "token_estimator": "heuristic-v1", "text": "## SHA-256 (0x02)\n\nThe SHA-256 precompile computes the SHA-256 hash of the input data.\n\n```solidity title=\"SHA256.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract SHA256Example {\n event SHA256Called(bytes result);\n\n // Address of the SHA256 precompile\n address constant SHA256_PRECOMPILE = address(0x02);\n\n bytes public result;\n\n function callH256(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = SHA256_PRECOMPILE.call{value: 0}(input);\n\n if (success) {\n emit SHA256Called(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `SHA256Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call callH256 with arbitrary bytes. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/SHA256.js){target=\\_blank} shows how to pass a UTF-8 string, hash it using the precompile, and compare it with the expected hash from Node.js's [crypto](https://www.npmjs.com/package/crypto-js){target=\\_blank} module."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 4, "depth": 2, "title": "RIPEMD-160 (0x03)", "anchor": "ripemd-160-0x03", "start_char": 4399, "end_char": 5788, "estimated_token_count": 299, "token_estimator": "heuristic-v1", "text": "## RIPEMD-160 (0x03)\n\nThe RIPEMD-160 precompile computes the RIPEMD-160 hash of the input data.\n\n```solidity title=\"RIPEMD160.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract RIPEMD160Example {\n // RIPEMD-160 precompile address\n address constant RIPEMD160_PRECOMPILE = address(0x03);\n\n bytes32 public result;\n\n event RIPEMD160Called(bytes32 result);\n\n function calculateRIPEMD160(bytes calldata input) public returns (bytes32) {\n (bool success, bytes memory returnData) = RIPEMD160_PRECOMPILE.call(\n input\n );\n require(success, \"RIPEMD-160 precompile call failed\");\n // return full 32 bytes, no assembly extraction\n bytes32 fullHash;\n assembly {\n fullHash := mload(add(returnData, 32))\n }\n result = fullHash;\n emit RIPEMD160Called(fullHash);\n return fullHash;\n }\n}\n```\n\nTo use it, you can deploy the `RIPEMD160Example` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `calculateRIPEMD160` with arbitrary bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/RIPEMD160.js){target=\\_blank} shows how to hash a UTF-8 string, pad the 20-byte result to 32 bytes, and verify it against the expected output."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 5, "depth": 2, "title": "Identity (Data Copy) (0x04)", "anchor": "identity-data-copy-0x04", "start_char": 5788, "end_char": 7024, "estimated_token_count": 259, "token_estimator": "heuristic-v1", "text": "## Identity (Data Copy) (0x04)\n\nThe Identity precompile simply returns the input data as output. While seemingly trivial, it can be useful for testing and certain specialized scenarios.\n\n```solidity title=\"Identity.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract IdentityExample {\n event IdentityCalled(bytes result);\n\n // Address of the Identity precompile\n address constant IDENTITY_PRECOMPILE = address(0x04);\n\n bytes public result;\n\n function callIdentity(bytes calldata input) public {\n bool success;\n bytes memory resultInMemory;\n\n (success, resultInMemory) = IDENTITY_PRECOMPILE.call(input);\n\n if (success) {\n emit IdentityCalled(resultInMemory);\n }\n\n result = resultInMemory;\n }\n}\n```\n\nTo use it, you can deploy the `IdentityExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callIdentity` with arbitrary bytes. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Identity.js){target=\\_blank} shows how to pass input data and verify that the precompile returns it unchanged."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 6, "depth": 2, "title": "Modular Exponentiation (0x05)", "anchor": "modular-exponentiation-0x05", "start_char": 7024, "end_char": 8506, "estimated_token_count": 309, "token_estimator": "heuristic-v1", "text": "## Modular Exponentiation (0x05)\n\nThe ModExp precompile performs modular exponentiation, which is an operation commonly needed in cryptographic algorithms.\n\n```solidity title=\"ModExp.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract ModExpExample {\n address constant MODEXP_ADDRESS = address(0x05);\n\n function modularExponentiation(\n bytes memory base,\n bytes memory exponent,\n bytes memory modulus\n ) public view returns (bytes memory) {\n bytes memory input = abi.encodePacked(\n toBytes32(base.length),\n toBytes32(exponent.length),\n toBytes32(modulus.length),\n base,\n exponent,\n modulus\n );\n\n (bool success, bytes memory result) = MODEXP_ADDRESS.staticcall(input);\n require(success, \"ModExp precompile call failed\");\n\n return result;\n }\n\n function toBytes32(uint256 value) internal pure returns (bytes32) {\n return bytes32(value);\n }\n}\n```\n\nTo use it, you can deploy the `ModExpExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `modularExponentiation` with encoded `base`, `exponent`, and `modulus` bytes. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/ModExp.js){target=\\_blank} shows how to test modular exponentiation like (4 ** 13) % 497 = 445."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 7, "depth": 2, "title": "BN128 Addition (0x06)", "anchor": "bn128-addition-0x06", "start_char": 8506, "end_char": 10020, "estimated_token_count": 343, "token_estimator": "heuristic-v1", "text": "## BN128 Addition (0x06)\n\nThe BN128Add precompile performs addition on the alt_bn128 elliptic curve, which is essential for zk-SNARK operations.\n\n```solidity title=\"BN128Add.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.20;\n\ncontract BN128AddExample {\n address constant BN128_ADD_PRECOMPILE = address(0x06);\n\n event BN128Added(uint256 x3, uint256 y3);\n\n uint256 public resultX;\n uint256 public resultY;\n\n function callBN128Add(uint256 x1, uint256 y1, uint256 x2, uint256 y2) public {\n bytes memory input = abi.encodePacked(\n bytes32(x1), bytes32(y1), bytes32(x2), bytes32(y2)\n );\n\n bool success;\n bytes memory output;\n\n (success, output) = BN128_ADD_PRECOMPILE.call{value: 0}(input);\n\n require(success, \"BN128Add precompile call failed\");\n require(output.length == 64, \"Invalid output length\");\n\n (uint256 x3, uint256 y3) = abi.decode(output, (uint256, uint256));\n\n resultX = x3;\n resultY = y3;\n\n emit BN128Added(x3, y3);\n }\n}\n```\n\nTo use it, you can deploy the `BN128AddExample` contract in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBN128Add` with valid `alt_bn128` points. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Add.js){target=\\_blank} demonstrates a valid curve addition and checks the result against known expected values."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 8, "depth": 2, "title": "BN128 Scalar Multiplication (0x07)", "anchor": "bn128-scalar-multiplication-0x07", "start_char": 10020, "end_char": 11756, "estimated_token_count": 369, "token_estimator": "heuristic-v1", "text": "## BN128 Scalar Multiplication (0x07)\n\nThe BN128Mul precompile performs scalar multiplication on the alt_bn128 curve.\n\n```solidity title=\"BN128Mul.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract BN128MulExample {\n // Precompile address for BN128Mul\n address constant BN128_MUL_ADDRESS = address(0x07);\n\n bytes public result;\n\n // Performs scalar multiplication of a point on the alt_bn128 curve\n function bn128ScalarMul(uint256 x1, uint256 y1, uint256 scalar) public {\n // Format: [x, y, scalar] - each 32 bytes\n bytes memory input = abi.encodePacked(\n bytes32(x1),\n bytes32(y1),\n bytes32(scalar)\n );\n\n (bool success, bytes memory resultInMemory) = BN128_MUL_ADDRESS.call{\n value: 0\n }(input);\n require(success, \"BN128Mul precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper to decode result from `result` storage\n function getResult() public view returns (uint256 x2, uint256 y2) {\n bytes memory tempResult = result;\n require(tempResult.length >= 64, \"Invalid result length\");\n assembly {\n x2 := mload(add(tempResult, 32))\n y2 := mload(add(tempResult, 64))\n }\n }\n}\n```\n\nTo use it, deploy `BN128MulExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `bn128ScalarMul` with a valid point and scalar. 
This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Mul.js){target=\\_blank} shows how to test the operation and verify the expected scalar multiplication result on `alt_bn128`."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 9, "depth": 2, "title": "BN128 Pairing Check (0x08)", "anchor": "bn128-pairing-check-0x08", "start_char": 11756, "end_char": 13264, "estimated_token_count": 313, "token_estimator": "heuristic-v1", "text": "## BN128 Pairing Check (0x08)\n\nThe BN128Pairing precompile verifies a pairing equation on the alt_bn128 curve, which is critical for zk-SNARK verification.\n\n```solidity title=\"BN128Pairing.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract BN128PairingExample {\n // Precompile address for BN128Pairing\n address constant BN128_PAIRING_ADDRESS = address(0x08);\n\n bytes public result;\n\n // Performs a pairing check on the alt_bn128 curve\n function bn128Pairing(bytes memory input) public {\n // Call the precompile\n (bool success, bytes memory resultInMemory) = BN128_PAIRING_ADDRESS\n .call{value: 0}(input);\n require(success, \"BN128Pairing precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper function to decode the result from `result` storage\n function getResult() public view returns (bool isValid) {\n bytes memory tempResult = result;\n require(tempResult.length == 32, \"Invalid result length\");\n\n uint256 output;\n assembly {\n output := mload(add(tempResult, 32))\n }\n\n isValid = (output == 1);\n }\n}\n```\n\nYou can deploy `BN128PairingExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or your preferred environment. 
Check out this [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/BN128Pairing.js){target=\\_blank} contains these tests with working examples."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 10, "depth": 2, "title": "Blake2F (0x09)", "anchor": "blake2f-0x09", "start_char": 13264, "end_char": 17391, "estimated_token_count": 945, "token_estimator": "heuristic-v1", "text": "## Blake2F (0x09)\n\nThe Blake2F precompile performs the Blake2 compression function F, which is the core of the Blake2 hash function.\n\n```solidity title=\"Blake2F.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.0;\n\ncontract Blake2FExample {\n // Precompile address for Blake2F\n address constant BLAKE2F_ADDRESS = address(0x09);\n\n bytes public result;\n\n function blake2F(bytes memory input) public {\n // Input must be exactly 213 bytes\n require(input.length == 213, \"Invalid input length - must be 213 bytes\");\n\n // Call the precompile\n (bool success, bytes memory resultInMemory) = BLAKE2F_ADDRESS.call{\n value: 0\n }(input);\n require(success, \"Blake2F precompile call failed\");\n\n result = resultInMemory;\n }\n\n // Helper function to decode the result from `result` storage\n function getResult() public view returns (bytes32[8] memory output) {\n bytes memory tempResult = result;\n require(tempResult.length == 64, \"Invalid result length\");\n\n for (uint i = 0; i < 8; i++) {\n assembly {\n mstore(add(output, mul(32, i)), mload(add(add(tempResult, 32), mul(32, i))))\n }\n }\n }\n\n\n // Helper function to create Blake2F input from parameters\n function createBlake2FInput(\n uint32 rounds,\n bytes32[8] memory h,\n bytes32[16] memory m,\n bytes8[2] memory t,\n bool f\n ) public pure returns (bytes memory) {\n // Start with rounds (4 bytes, big-endian)\n bytes memory input = abi.encodePacked(rounds);\n\n // Add state vector h (8 * 32 = 256 bytes)\n for (uint i = 0; i < 8; i++) {\n input = abi.encodePacked(input, h[i]);\n }\n\n // Add message block m (16 * 32 = 512 bytes, but we need to convert to 16 * 8 = 128 bytes)\n // Blake2F expects 64-bit words in little-endian format\n for (uint i = 0; i < 16; i++) {\n // Take only the first 8 bytes of each bytes32 and reverse for little-endian\n bytes8 word = bytes8(m[i]);\n input = abi.encodePacked(input, word);\n }\n\n // Add offset counters t (2 * 8 = 16 bytes)\n input = abi.encodePacked(input, t[0], t[1]);\n\n // Add final block flag (1 byte)\n input = abi.encodePacked(input, f ? 
bytes1(0x01) : bytes1(0x00));\n\n return input;\n }\n\n // Simplified function that works with raw hex input\n function blake2FFromHex(string memory hexInput) public {\n bytes memory input = hexStringToBytes(hexInput);\n blake2F(input);\n }\n\n // Helper function to convert hex string to bytes\n function hexStringToBytes(string memory hexString) public pure returns (bytes memory) {\n bytes memory hexBytes = bytes(hexString);\n require(hexBytes.length % 2 == 0, \"Invalid hex string length\");\n \n bytes memory result = new bytes(hexBytes.length / 2);\n \n for (uint i = 0; i < hexBytes.length / 2; i++) {\n result[i] = bytes1(\n (hexCharToByte(hexBytes[2 * i]) << 4) | \n hexCharToByte(hexBytes[2 * i + 1])\n );\n }\n \n return result;\n }\n\n function hexCharToByte(bytes1 char) internal pure returns (uint8) {\n uint8 c = uint8(char);\n if (c >= 48 && c <= 57) return c - 48; // 0-9\n if (c >= 65 && c <= 70) return c - 55; // A-F\n if (c >= 97 && c <= 102) return c - 87; // a-f\n revert(\"Invalid hex character\");\n }\n}\n```\n\nTo use it, deploy `Blake2FExample` in [Remix](/develop/smart-contracts/dev-environments/remix){target=\\_blank} or any Solidity-compatible environment and call `callBlake2F` with the properly formatted input parameters for rounds, state vector, message block, offset counters, and final block flag. This [test file](https://github.com/polkadot-developers/polkavm-hardhat-examples/blob/v0.0.3/precompiles-hardhat/test/Blake2.js){target=\\_blank} demonstrates how to perform Blake2 compression with different rounds and verify the correctness of the output against known test vectors."} +{"page_id": "develop-smart-contracts-precompiles-interact-with-precompiles", "page_title": "Interact with Precompiles", "index": 11, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 17391, "end_char": 18009, "estimated_token_count": 92, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nPrecompiles in Polkadot Hub provide efficient, native implementations of cryptographic functions and other commonly used operations. By understanding how to interact with these precompiles from your Solidity contracts, you can build more efficient and feature-rich applications on the Polkadot ecosystem.\n\nThe examples provided in this guide demonstrate the basic patterns for interacting with each precompile. 
Developers can adapt these patterns to their specific use cases, leveraging the performance benefits of native implementations while maintaining the flexibility of smart contract development."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 18, "end_char": 913, "estimated_token_count": 191, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nThe [XCM (Cross-Consensus Message)](/develop/interoperability/intro-to-xcm){target=\\_blank} precompile enables Polkadot Hub developers to access XCM functionality directly from their smart contracts using a Solidity interface.\n\nLocated at the fixed address `0x00000000000000000000000000000000000a0000`, the XCM precompile offers three primary functions:\n\n- **`execute`**: For local XCM execution.\n- **`send`**: For cross-chain message transmission.\n- **`weighMessage`**: For cost estimation.\n\nThis guide demonstrates how to interact with the XCM precompile through Solidity smart contracts using [Remix IDE](/develop/smart-contracts/dev-environments/remix){target=\\_blank}.\n\n!!!note\n The XCM precompile provides the barebones XCM functionality. While it provides a lot of flexibility, it doesn't provide abstractions to hide away XCM details. These have to be built on top."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 1, "depth": 2, "title": "Precompile Interface", "anchor": "precompile-interface", "start_char": 913, "end_char": 4064, "estimated_token_count": 708, "token_estimator": "heuristic-v1", "text": "## Precompile Interface\n\nThe XCM precompile implements the `IXcm` interface, which defines the structure for interacting with XCM functionality. 
The source code for the interface is as follows:\n\n```solidity title=\"IXcm.sol\"\n// SPDX-License-Identifier: MIT\npragma solidity ^0.8.20;\n\n/// @dev The on-chain address of the XCM (Cross-Consensus Messaging) precompile.\naddress constant XCM_PRECOMPILE_ADDRESS = address(0xA0000);\n\n/// @title XCM Precompile Interface\n/// @notice A low-level interface for interacting with `pallet_xcm`.\n/// It forwards calls directly to the corresponding dispatchable functions,\n/// providing access to XCM execution and message passing.\n/// @dev Documentation:\n/// @dev - XCM: https://docs.polkadot.com/develop/interoperability\n/// @dev - SCALE codec: https://docs.polkadot.com/polkadot-protocol/parachain-basics/data-encoding\n/// @dev - Weights: https://docs.polkadot.com/polkadot-protocol/parachain-basics/blocks-transactions-fees/fees/#transactions-weights-and-fees\ninterface IXcm {\n /// @notice Weight v2 used for measurement for an XCM execution\n struct Weight {\n /// @custom:property The computational time used to execute some logic based on reference hardware.\n uint64 refTime;\n /// @custom:property The size of the proof needed to execute some logic.\n uint64 proofSize;\n }\n\n /// @notice Executes an XCM message locally on the current chain with the caller's origin.\n /// @dev Internally calls `pallet_xcm::execute`.\n /// @param message A SCALE-encoded Versioned XCM message.\n /// @param weight The maximum allowed `Weight` for execution.\n /// @dev Call @custom:function weighMessage(message) to ensure sufficient weight allocation.\n function execute(bytes calldata message, Weight calldata weight) external;\n\n /// @notice Sends an XCM message to another parachain or consensus system.\n /// @dev Internally calls `pallet_xcm::send`.\n /// @param destination SCALE-encoded destination MultiLocation.\n /// @param message SCALE-encoded Versioned XCM message.\n function send(bytes calldata destination, bytes calldata message) external;\n\n /// @notice Estimates the `Weight` required to execute a given XCM message.\n /// @param message SCALE-encoded Versioned XCM message to analyze.\n /// @return weight Struct containing estimated `refTime` and `proofSize`.\n function weighMessage(bytes calldata message) external view returns (Weight memory weight);\n}\n```\n\nThe interface defines a `Weight` struct that represents the computational cost of XCM operations. Weight has two components: \n\n- **`refTime`**: Computational time on reference hardware.\n- **`proofSize`**: The size of the proof required for execution.\n\nAll XCM messages must be encoded using the [SCALE codec](/polkadot-protocol/parachain-basics/data-encoding/#data-encoding){target=\\_blank}, Polkadot's standard serialization format.\n\nFor further information, check the [`precompiles/IXCM.sol`](https://github.com/paritytech/polkadot-sdk/blob/cb629d46ebf00aa65624013a61f9c69ebf02b0b4/polkadot/xcm/pallet-xcm/src/precompiles/IXcm.sol){target=\\_blank} file present in `pallet-xcm`."} {"page_id": "develop-smart-contracts-precompiles-xcm-precompile", "page_title": "Interact with the XCM Precompile", "index": 2, "depth": 2, "title": "Interact with the XCM Precompile", "anchor": "interact-with-the-xcm-precompile", "start_char": 4064, "end_char": 5303, "estimated_token_count": 306, "token_estimator": "heuristic-v1", "text": "## Interact with the XCM Precompile\n\nTo interact with the XCM precompile, you can use the precompile interface directly in [Remix IDE](/develop/smart-contracts/dev-environments/remix/){target=\\_blank}:\n\n1. 
Create a new file called `IXcm.sol` in Remix.\n2. Copy and paste the `IXcm` interface code into the file.\n3. Compile the interface by selecting the button or using **Ctrl +S** keys:\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-01.webp)\n\n4. In the **Deploy & Run Transactions** tab, select the `IXcm` interface from the contract dropdown.\n5. Enter the precompile address `0x00000000000000000000000000000000000a0000` in the **At Address** input field.\n6. Select the **At Address** button to connect to the precompile.\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-02.webp)\n\n7. Once connected, you can use the Remix interface to interact with the XCM precompile's `execute`, `send`, and `weighMessage` functions.\n\n ![](/images/develop/smart-contracts/precompiles/xcm-precompile/xcm-precompile-03.webp)\n\nThe main entrypoint of the precompile is the `execute` function. However, it's necessary to first call `weighMessage` to fill in the required parameters."} @@ -786,8 +786,8 @@ {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 8, "depth": 3, "title": "Activate using Polkadot.js Apps", "anchor": "activate-using-polkadotjs-apps", "start_char": 9803, "end_char": 11084, "estimated_token_count": 345, "token_estimator": "heuristic-v1", "text": "### Activate using Polkadot.js Apps\n\nFollow these steps to use Polkadot.js Apps to activate your validator:\n\n1. In Polkadot.js Apps, navigate to **Network** and select **Staking**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-01.webp)\n\n2. Open the **Accounts** tab and click on **+ Validator**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-02.webp)\n\n3. Set a bond amount in the **value bonded** field and then click **next**:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-03.webp)\n\n4. **Set session keys**. Paste the output from `author_rotateKeys` (hex-encoded) to link your validator with its session keys. Then click **Bond & Validate** to continue:\n\n ![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-04.webp)\n\nYou can set the **commission** and the **blocked** option via `staking.validate` extrinsic. By default, the blocked option is set to FALSE (i.e., the validator accepts nominations):\n\n![](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-05.webp)"} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 9, "depth": 3, "title": "Monitor Validation Status and Slots", "anchor": "monitor-validation-status-and-slots", "start_char": 11084, "end_char": 12036, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "### Monitor Validation Status and Slots\n\nOn the [**Staking**](https://polkadot.js.org/apps/#/staking){target=\\_blank} tab in Polkadot.js Apps, you can see your validator's status, the number of available validator slots, and the nodes that have signaled their intent to validate. Your node may initially appear in the waiting queue, especially if the validator slots are full. 
The following is an example view of the **Staking** tab:\n\n![staking queue](/images/infrastructure/running-a-validator/onboarding-and-offboarding/start-validating/start-validating-06.webp)\n\nThe validator set refreshes each era. If there's an available slot in the next era, your node may be selected to move from the waiting queue to the active validator set, allowing it to start validating blocks. If your validator is not selected, it remains in the waiting queue. Increasing your stake or gaining more nominators may improve your chance of being selected in future eras."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 10, "depth": 2, "title": "Run a Validator Using Systemd", "anchor": "run-a-validator-using-systemd", "start_char": 12036, "end_char": 13060, "estimated_token_count": 218, "token_estimator": "heuristic-v1", "text": "## Run a Validator Using Systemd\n\nRunning your Polkadot validator as a [systemd](https://en.wikipedia.org/wiki/Systemd){target=\\_blank} service is an effective way to ensure its high uptime and reliability. Using systemd allows your validator to automatically restart after server reboots or unexpected crashes, significantly reducing the risk of slashing due to downtime.\n\nThis following sections will walk you through creating and managing a systemd service for your validator, allowing you to seamlessly monitor and control it as part of your Linux system. \n\nEnsure the following requirements are met before proceeding with the systemd setup:\n\n- Confirm your system meets the [requirements](/infrastructure/running-a-validator/requirements/){target=\\_blank} for running a validator.\n- Ensure you meet the [minimum bond requirements](https://wiki.polkadot.com/general/chain-state-values/#minimum-validator-bond){target=\\_blank} for validating.\n- Verify the Polkadot binary is [installed](#install-the-polkadot-binaries)."} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 11, "depth": 3, "title": "Create the Systemd Service File", "anchor": "create-the-systemd-service-file", "start_char": 13060, "end_char": 13830, "estimated_token_count": 178, "token_estimator": "heuristic-v1", "text": "### Create the Systemd Service File\n\nFirst create a new unit file called `polkadot-validator.service` in `/etc/systemd/system/`:\n\n```bash\ntouch /etc/systemd/system/polkadot-validator.service\n```\n\nIn this unit file, you will write the commands that you want to run on server boot/restart:\n\n```systemd title=\"/etc/systemd/system/polkadot-validator.service\"\n\n```\n\n!!! warning \"Restart delay and equivocation risk\"\n It is recommended that a node's restart be delayed with `RestartSec` in the case of a crash. It's possible that when a node crashes, consensus votes in GRANDPA aren't persisted to disk. In this case, there is potential to equivocate when immediately restarting. 
Delaying the restart will allow the network to progress past potentially conflicting votes."} -{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 12, "depth": 3, "title": "Run the Service", "anchor": "run-the-service", "start_char": 13830, "end_char": 14805, "estimated_token_count": 243, "token_estimator": "heuristic-v1", "text": "### Run the Service\n\nActivate the systemd service to start on system boot by running:\n\n```bash\nsystemctl enable polkadot-validator.service\n```\n\nTo start the service manually, use:\n\n```bash\nsystemctl start polkadot-validator.service\n```\n\nCheck the service's status to confirm it is running:\n\n```bash\nsystemctl status polkadot-validator.service\n```\n\nTo view the logs in real-time, use [journalctl](https://www.freedesktop.org/software/systemd/man/latest/journalctl.html){target=\\_blank} like so:\n\n```bash\njournalctl -f -u polkadot-validator\n```\n\nWith these steps, you can effectively manage and monitor your validator as a systemd service.\n\nOnce your validator is active, it's officially part of Polkadot's security infrastructure. For questions or further support, you can reach out to the [Polkadot Validator chat](https://matrix.to/#/!NZrbtteFeqYKCUGQtr:matrix.parity.io?via=matrix.parity.io&via=matrix.org&via=web3.foundation){target=\\_blank} for tips and troubleshooting."} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 11, "depth": 3, "title": "Create the Systemd Service File", "anchor": "create-the-systemd-service-file", "start_char": 13060, "end_char": 14799, "estimated_token_count": 338, "token_estimator": "heuristic-v1", "text": "### Create the Systemd Service File\n\nFirst create a new unit file called `polkadot-validator.service` in `/etc/systemd/system/`:\n\n```bash\ntouch /etc/systemd/system/polkadot-validator.service\n```\n\nIn this unit file, you will write the commands that you want to run on server boot/restart:\n\n```systemd title=\"/etc/systemd/system/polkadot-validator.service\"\n[Unit]\nDescription=Polkadot Node\nAfter=network.target\nDocumentation=https://github.com/paritytech/polkadot-sdk\n\n[Service]\nEnvironmentFile=-/etc/default/polkadot\nExecStart=/usr/bin/polkadot $POLKADOT_CLI_ARGS\nUser=polkadot\nGroup=polkadot\nRestart=always\nRestartSec=120\nCapabilityBoundingSet=\nLockPersonality=true\nNoNewPrivileges=true\nPrivateDevices=true\nPrivateMounts=true\nPrivateTmp=true\nPrivateUsers=true\nProtectClock=true\nProtectControlGroups=true\nProtectHostname=true\nProtectKernelModules=true\nProtectKernelTunables=true\nProtectSystem=strict\nRemoveIPC=true\nRestrictAddressFamilies=AF_INET AF_INET6 AF_NETLINK AF_UNIX\nRestrictNamespaces=false\nRestrictSUIDSGID=true\nSystemCallArchitectures=native\nSystemCallFilter=@system-service\nSystemCallFilter=landlock_add_rule landlock_create_ruleset landlock_restrict_self seccomp mount umount2\nSystemCallFilter=~@clock @module @reboot @swap @privileged\nSystemCallFilter=pivot_root\nUMask=0027\n\n[Install]\nWantedBy=multi-user.target\n```\n\n!!! warning \"Restart delay and equivocation risk\"\n It is recommended that a node's restart be delayed with `RestartSec` in the case of a crash. It's possible that when a node crashes, consensus votes in GRANDPA aren't persisted to disk. In this case, there is potential to equivocate when immediately restarting. 
Delaying the restart will allow the network to progress past potentially conflicting votes."} +{"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-start-validating", "page_title": "Start Validating", "index": 12, "depth": 3, "title": "Run the Service", "anchor": "run-the-service", "start_char": 14799, "end_char": 15774, "estimated_token_count": 243, "token_estimator": "heuristic-v1", "text": "### Run the Service\n\nActivate the systemd service to start on system boot by running:\n\n```bash\nsystemctl enable polkadot-validator.service\n```\n\nTo start the service manually, use:\n\n```bash\nsystemctl start polkadot-validator.service\n```\n\nCheck the service's status to confirm it is running:\n\n```bash\nsystemctl status polkadot-validator.service\n```\n\nTo view the logs in real-time, use [journalctl](https://www.freedesktop.org/software/systemd/man/latest/journalctl.html){target=\\_blank} like so:\n\n```bash\njournalctl -f -u polkadot-validator\n```\n\nWith these steps, you can effectively manage and monitor your validator as a systemd service.\n\nOnce your validator is active, it's officially part of Polkadot's security infrastructure. For questions or further support, you can reach out to the [Polkadot Validator chat](https://matrix.to/#/!NZrbtteFeqYKCUGQtr:matrix.parity.io?via=matrix.parity.io&via=matrix.org&via=web3.foundation){target=\\_blank} for tips and troubleshooting."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 19, "end_char": 498, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIf you're ready to stop validating on Polkadot, there are essential steps to ensure a smooth transition while protecting your funds and account integrity. Whether you're taking a break for maintenance or unbonding entirely, you'll need to chill your validator, purge session keys, and unbond your tokens. This guide explains how to use Polkadot's tools and extrinsics to safely withdraw from validation activities, safeguarding your account's future usability."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 1, "depth": 2, "title": "Pause Versus Stop", "anchor": "pause-versus-stop", "start_char": 498, "end_char": 920, "estimated_token_count": 83, "token_estimator": "heuristic-v1", "text": "## Pause Versus Stop\n\nIf you wish to remain a validator or nominator (for example, stopping for planned downtime or server maintenance), submitting the `chill` extrinsic in the `staking` pallet should suffice. Additional steps are only needed to unbond funds or reap an account.\n\nThe following are steps to ensure a smooth stop to validation:\n\n- Chill the validator.\n- Purge validator session keys.\n- Unbond your tokens."} {"page_id": "infrastructure-running-a-validator-onboarding-and-offboarding-stop-validating", "page_title": "Stop Validating", "index": 2, "depth": 2, "title": "Chill Validator", "anchor": "chill-validator", "start_char": 920, "end_char": 1499, "estimated_token_count": 117, "token_estimator": "heuristic-v1", "text": "## Chill Validator\n\nWhen stepping back from validating, the first step is to chill your validator status. 
This action stops your validator from being considered for the next era without fully unbonding your tokens, which can be useful for temporary pauses like maintenance or planned downtime.\n\nUse the `staking.chill` extrinsic to initiate this. For more guidance on chilling your node, refer to the [Pause Validating](/infrastructure/running-a-validator/operational-tasks/pause-validating/){target=\\_blank} guide. You may also claim any pending staking rewards at this point."} @@ -1040,19 +1040,19 @@ {"page_id": "polkadot-protocol-onchain-governance", "page_title": "On-Chain Governance", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 2065, "end_char": 2114, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 12, "end_char": 597, "estimated_token_count": 101, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nAccounts are essential for managing identity, transactions, and governance on the network in the Polkadot SDK. Understanding these components is critical for seamless development and operation on the network, whether you're building or interacting with Polkadot-based chains.\n\nThis page will guide you through the essential aspects of accounts, including their data structure, balance types, reference counters, and address formats. You’ll learn how accounts are managed within the runtime, how balances are categorized, and how addresses are encoded and validated."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 1, "depth": 2, "title": "Account Data Structure", "anchor": "account-data-structure", "start_char": 597, "end_char": 862, "estimated_token_count": 42, "token_estimator": "heuristic-v1", "text": "## Account Data Structure\n\nAccounts are foundational to any blockchain, and the Polkadot SDK provides a flexible management system. This section explains how the Polkadot SDK defines accounts and manages their lifecycle through data structures within the runtime."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 2, "depth": 3, "title": "Account", "anchor": "account", "start_char": 862, "end_char": 2898, "estimated_token_count": 501, "token_estimator": "heuristic-v1", "text": "### Account\n\nThe [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Account.html){target=\\_blank} is a storage map within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank} that links an account ID to its corresponding data. This structure is fundamental for mapping account-related information within the chain.\n\nThe code snippet below shows how accounts are defined:\n\n```rs\n \n```\n\nThe preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. 
Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`).\n\nThe `StorageMap` consists of the following parameters:\n\n- **`_`**: Used in macro expansion and acts as a placeholder for the storage prefix type. Tells the macro to insert the default prefix during expansion.\n- **`Blake2_128Concat`**: The hashing function applied to keys in the storage map.\n- **`T::AccountId`**: Represents the key type, which corresponds to the account’s unique ID.\n- **`AccountInfo`**: The value type stored in the map. For each account ID, the map stores an `AccountInfo` struct containing:\n\n    - **`T::Nonce`**: A nonce for the account, which is incremented with each transaction to ensure transaction uniqueness.\n    - **`T::AccountData`**: Custom account data defined by the runtime configuration, which could include balances, locked funds, or other relevant information.\n    \n- **`ValueQuery`**: Defines how queries to the storage map behave when no value is found; returns a default value instead of `None`.\n\nFor a detailed explanation of storage maps, see the [`StorageMap`](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/struct.StorageMap.html){target=\\_blank} entry in the Rust docs."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 3, "depth": 3, "title": "Account Info", "anchor": "account-info", "start_char": 2898, "end_char": 4651, "estimated_token_count": 407, "token_estimator": "heuristic-v1", "text": "### Account Info\n\nThe `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules.\n\n```rs\n\n```\n\nThe `AccountInfo` structure includes the following components:\n\n- **`nonce`**: Tracks the number of transactions initiated by the account, which ensures transaction uniqueness and prevents replay attacks.\n- **`consumers`**: Counts how many other modules or pallets rely on this account’s existence. The account cannot be removed from the chain (reaped) until this count reaches zero.\n- **`providers`**: Tracks how many modules permit this account’s existence. An account can only be reaped once both `providers` and `sufficients` are zero.\n- **`sufficients`**: Represents the number of modules that allow the account to exist for internal purposes, independent of any other modules.\n- **`AccountData`**: A flexible data structure that can be customized in the runtime configuration, usually containing balances or other user-specific data.\n\nThis structure helps manage an account's state and prevents its premature removal while it is still referenced by other on-chain data or modules. 
The [`AccountInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.AccountInfo.html){target=\\_blank} structure can vary as long as it satisfies the trait bounds defined by the `AccountData` associated type in the [`frame-system::pallet::Config`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html){target=\\_blank} trait."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 4, "depth": 3, "title": "Account Reference Counters", "anchor": "account-reference-counters", "start_char": 4651, "end_char": 9744, "estimated_token_count": 1040, "token_estimator": "heuristic-v1", "text": "### Account Reference Counters\n\nPolkadot SDK uses reference counters to track an account’s dependencies across different runtime modules. These counters ensure that accounts remain active while data is associated with them.\n\nThe reference counters include:\n\n- **`consumers`**: Prevents account removal while other pallets still rely on the account.\n- **`providers`**: Ensures an account is active before other pallets store data related to it.\n- **`sufficients`**: Indicates the account’s independence, ensuring it can exist even without a native token balance, such as when holding sufficient alternative assets.\n\n#### Providers Reference Counters\n\nThe `providers` counter ensures that an account is ready to be depended upon by other runtime modules. For example, it is incremented when an account has a balance above the existential deposit, which marks the account as active.\n\nThe system requires this reference counter to be greater than zero for the `consumers` counter to be incremented, ensuring the account is stable before any dependencies are added.\n\n#### Consumers Reference Counters\n\nThe `consumers` counter ensures that the account cannot be reaped until all references to it across the runtime have been removed. This check prevents the accidental deletion of accounts that still have active on-chain data.\n\nIt is the user’s responsibility to clear out any data from other runtime modules if they wish to remove their account and reclaim their existential deposit.\n\n#### Sufficients Reference Counter\n\nThe `sufficients` counter tracks accounts that can exist independently without relying on a native account balance. This is useful for accounts holding other types of assets, like tokens, without needing a minimum balance in the native token.\n\nFor instance, the [Assets pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_assets/index.html){target=\\_blank}, may increment this counter for an account holding sufficient tokens.\n\n#### Account Deactivation\n\nIn Polkadot SDK-based chains, an account is deactivated when its reference counters (such as `providers`, `consumers`, and `sufficient`) reach zero. These counters ensure the account remains active as long as other runtime modules or pallets reference it.\n\nWhen all dependencies are cleared and the counters drop to zero, the account becomes deactivated and may be removed from the chain (reaped). This is particularly important in Polkadot SDK-based blockchains, where accounts with balances below the existential deposit threshold are pruned from storage to conserve state resources.\n\nEach pallet that references an account has cleanup functions that decrement these counters when the pallet no longer depends on the account. 
Once these counters reach zero, the account is marked for deactivation.\n\n#### Updating Counters\n\nThe Polkadot SDK provides runtime developers with various methods to manage account lifecycle events, such as deactivation or incrementing reference counters. These methods ensure that accounts cannot be reaped while still in use.\n\nThe following helper functions manage these counters:\n\n- **`inc_consumers()`**: Increments the `consumer` reference counter for an account, signaling that another pallet depends on it.\n- **`dec_consumers()`**: Decrements the `consumer` reference counter, signaling that a pallet no longer relies on the account.\n- **`inc_providers()`**: Increments the `provider` reference counter, ensuring the account remains active.\n- **`dec_providers()`**: Decrements the `provider` reference counter, allowing for account deactivation when no longer in use.\n- **`inc_sufficients()`**: Increments the `sufficient` reference counter for accounts that hold sufficient assets.\n- **`dec_sufficients()`**: Decrements the `sufficient` reference counter.\n\nTo ensure proper account cleanup and lifecycle management, a corresponding decrement should be made for each increment action.\n\nThe `System` pallet offers three query functions to assist developers in tracking account states:\n\n- **[`can_inc_consumer()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_inc_consumer){target=\\_blank}**: Checks if the account can safely increment the consumer reference.\n- **[`can_dec_provider()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_dec_provider){target=\\_blank}**: Ensures that no consumers exist before allowing the decrement of the provider counter.\n- **[`is_provider_required()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.is_provider_required){target=\\_blank}**: Verifies whether the account still has any active consumer references.\n\nThis modular and flexible system of reference counters tightly controls the lifecycle of accounts in Polkadot SDK-based blockchains, preventing the accidental removal or retention of unneeded accounts. You can refer to the [System pallet Rust docs](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html){target=\\_blank} for more details."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 5, "depth": 2, "title": "Account Balance Types", "anchor": "account-balance-types", "start_char": 9744, "end_char": 11664, "estimated_token_count": 465, "token_estimator": "heuristic-v1", "text": "## Account Balance Types\n\nIn the Polkadot ecosystem, account balances are categorized into different types based on how the funds are utilized and their availability. These balance types determine the actions that can be performed, such as transferring tokens, paying transaction fees, or participating in governance activities. Understanding these balance types helps developers manage user accounts and implement balance-dependent logic.\n\n!!! note \"A more efficient distribution of account balance types is in development\"\n Soon, pallets in the Polkadot SDK will implement the [`Fungible` trait](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} (see the [tracking issue](https://github.com/paritytech/polkadot-sdk/issues/226){target=\\_blank} for more details). 
For example, the [`transaction-storage`](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_storage/index.html){target=\\_blank} pallet changed the implementation of the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/currency/index.html){target=\\_blank} trait (see the [Refactor transaction storage pallet to use fungible traits](https://github.com/paritytech/polkadot-sdk/pull/1800){target=\\_blank} PR for further details):\n\n ```rust\n type BalanceOf = <::Currency as Currency<::AccountId>>::Balance;\n ```\n \n To the [`Fungible`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} trait:\n\n ```rust\n type BalanceOf = <::Currency as FnInspect<::AccountId>>::Balance;\n ```\n \n This update will enable more efficient use of account balances, allowing the free balance to be utilized for on-chain activities such as setting proxies and managing identities."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 6, "depth": 3, "title": "Balance Types", "anchor": "balance-types", "start_char": 11664, "end_char": 14139, "estimated_token_count": 601, "token_estimator": "heuristic-v1", "text": "### Balance Types\n\nThe five main balance types are:\n\n- **Free balance**: Represents the total tokens available to the account for any on-chain activity, including staking, governance, and voting. However, it may not be fully spendable or transferrable if portions of it are locked or reserved.\n- **Locked balance**: Portions of the free balance that cannot be spent or transferred because they are tied up in specific activities like [staking](https://wiki.polkadot.com/learn/learn-staking/#nominating-validators){target=\\_blank}, [vesting](https://wiki.polkadot.com/learn/learn-guides-transfers/#vested-transfers-with-the-polkadot-js-ui){target=\\_blank}, or participating in [governance](https://wiki.polkadot.com/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\\_blank}. While the tokens remain part of the free balance, they are non-transferable for the duration of the lock.\n- **Reserved balance**: Funds locked by specific system actions, such as setting up an [identity](https://wiki.polkadot.com/learn/learn-identity/){target=\\_blank}, creating [proxies](https://wiki.polkadot.com/learn/learn-proxies/){target=\\_blank}, or submitting [deposits for governance proposals](https://wiki.polkadot.com/learn/learn-guides-polkadot-opengov/#claiming-opengov-deposits){target=\\_blank}. These tokens are not part of the free balance and cannot be spent unless they are unreserved.\n- **Spendable balance**: The portion of the free balance that is available for immediate spending or transfers. It is calculated by subtracting the maximum of locked or reserved amounts from the free balance, ensuring that existential deposit limits are met.\n- **Untouchable balance**: Funds that cannot be directly spent or transferred but may still be utilized for on-chain activities, such as governance participation or staking. These tokens are typically tied to certain actions or locked for a specific period.\n\nThe spendable balance is calculated as follows:\n\n```text\nspendable = free - max(locked - reserved, ED)\n```\n\nHere, `free`, `locked`, and `reserved` are defined above. 
The `ED` represents the [existential deposit](https://wiki.polkadot.com/learn/learn-accounts/#existential-deposit-and-reaping){target=\\_blank}, the minimum balance required to keep an account active and prevent it from being reaped. You may find you can't see all balance types when looking at your account via a wallet. Wallet providers often display only spendable, locked, and reserved balances."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 7, "depth": 3, "title": "Locks", "anchor": "locks", "start_char": 14139, "end_char": 16334, "estimated_token_count": 464, "token_estimator": "heuristic-v1", "text": "### Locks\n\nLocks are applied to an account's free balance, preventing that portion from being spent or transferred. Locks are automatically placed when an account participates in specific on-chain activities, such as staking or governance. Although multiple locks may be applied simultaneously, they do not stack. Instead, the largest lock determines the total amount of locked tokens.\n\nLocks follow these basic rules:\n\n- If different locks apply to varying amounts, the largest lock amount takes precedence.\n- If multiple locks apply to the same amount, the lock with the longest duration governs when the balance can be unlocked.\n\n#### Locks Example\n\nConsider an example where an account has 80 DOT locked for both staking and governance purposes like so:\n\n- 80 DOT is staked with a 28-day lock period.\n- 24 DOT is locked for governance with a 1x conviction and a 7-day lock period.\n- 4 DOT is locked for governance with a 6x conviction and a 224-day lock period.\n\nIn this case, the total locked amount is 80 DOT because only the largest lock (80 DOT from staking) governs the locked balance. These 80 DOT will be released at different times based on the lock durations. In this example, the 24 DOT locked for governance will be released first since the shortest lock period is seven days. The 80 DOT stake with a 28-day lock period is released next. Now, all that remains locked is the 4 DOT for governance. After 224 days, all 80 DOT (minus the existential deposit) will be free and transferable.\n\n![Illustration of Lock Example](/images/polkadot-protocol/parachain-basics/accounts/locks-example-2.webp)\n\n#### Edge Cases for Locks\n\nIn scenarios where multiple convictions and lock periods are active, the lock duration and amount are determined by the longest period and largest amount. For example, if you delegate with different convictions and attempt to undelegate during an active lock period, the lock may be extended for the full amount of tokens. For a detailed discussion on edge case lock behavior, see this [Stack Exchange post](https://substrate.stackexchange.com/questions/5067/delegating-and-undelegating-during-the-lock-period-extends-it-for-the-initial-am){target=\\_blank}."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 8, "depth": 3, "title": "Balance Types on Polkadot.js", "anchor": "balance-types-on-polkadotjs", "start_char": 16334, "end_char": 19399, "estimated_token_count": 611, "token_estimator": "heuristic-v1", "text": "### Balance Types on Polkadot.js\n\nPolkadot.js provides a user-friendly interface for managing and visualizing various account balances on Polkadot and Kusama networks. When interacting with Polkadot.js, you will encounter multiple balance types that are critical for understanding how your funds are distributed and restricted. 
This section explains how different balances are displayed in the Polkadot.js UI and what each type represents.\n\n![](/images/polkadot-protocol/parachain-basics/accounts/account-balance-types-1.webp)\n\nThe most common balance types displayed on Polkadot.js are:\n\n- **Total balance**: The total number of tokens available in the account. This includes all tokens, whether they are transferable, locked, reserved, or vested. However, the total balance does not always reflect what can be spent immediately. In this example, the total balance is 0.6274 KSM.\n\n- **Transferable balance**: Shows how many tokens are immediately available for transfer. It is calculated by subtracting the locked and reserved balances from the total balance. For example, if an account has a total balance of 0.6274 KSM and a transferable balance of 0.0106 KSM, only the latter amount can be sent or spent freely.\n\n- **Vested balance**: Tokens that are allocated to the account but are released according to a specific schedule. Vested tokens remain locked and cannot be transferred until fully vested. For example, an account with a vested balance of 0.2500 KSM means that this amount is owned but not yet transferable.\n\n- **Locked balance**: Tokens that are temporarily restricted from being transferred or spent. These locks typically result from participating in staking, governance, or vested transfers. In Polkadot.js, locked balances do not stack—only the largest lock is applied. For instance, if an account has 0.5500 KSM locked for governance and staking, the locked balance would display 0.5500 KSM, not the sum of all locked amounts.\n\n- **Reserved balance**: Refers to tokens locked for specific on-chain actions, such as setting an identity, creating a proxy, or making governance deposits. Reserved tokens are not part of the free balance, but can be freed by performing certain actions. For example, removing an identity would unreserve those funds.\n\n- **Bonded balance**: The tokens locked for staking purposes. Bonded tokens are not transferable until they are unbonded after the unbonding period.\n\n- **Redeemable balance**: The number of tokens that have completed the unbonding period and are ready to be unlocked and transferred again. For example, if an account has a redeemable balance of 0.1000 KSM, those tokens are now available for spending.\n\n- **Democracy balance**: Reflects the number of tokens locked for governance activities, such as voting on referenda. These tokens are locked for the duration of the governance action and are only released after the lock period ends.\n\nBy understanding these balance types and their implications, developers and users can better manage their funds and engage with on-chain activities more effectively."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 9, "depth": 2, "title": "Address Formats", "anchor": "address-formats", "start_char": 19399, "end_char": 19858, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Address Formats\n\nThe SS58 address format is a core component of the Polkadot SDK that enables accounts to be uniquely identified across Polkadot-based networks. This format is a modified version of Bitcoin's Base58Check encoding, specifically designed to accommodate the multi-chain nature of the Polkadot ecosystem. 
SS58 encoding allows each chain to define its own set of addresses while maintaining compatibility and checksum validation for security."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 10, "depth": 3, "title": "Basic Format", "anchor": "basic-format", "start_char": 19858, "end_char": 21100, "estimated_token_count": 295, "token_estimator": "heuristic-v1", "text": "### Basic Format\n\nSS58 addresses consist of three main components:\n\n```text\nbase58encode(concat(<address-type>, <address>, <checksum>))\n```\n\n- **Address type**: A byte or set of bytes that define the network (or chain) for which the address is intended. This ensures that addresses are unique across different Polkadot SDK-based chains.\n- **Address**: The public key of the account encoded as bytes.\n- **Checksum**: A hash-based checksum which ensures that addresses are valid and unaltered. The checksum is derived from the concatenated address type and address components, ensuring integrity.\n\nThe encoding process transforms the concatenated components into a Base58 string, providing a compact and human-readable format that avoids easily confused characters (e.g., zero '0', capital 'O', lowercase 'l'). This encoding function ([`encode`](https://docs.rs/bs58/latest/bs58/fn.encode.html){target=\\_blank}) is implemented exactly as defined in Bitcoin and IPFS specifications, using the same alphabet as both implementations.\n\nFor more details about the SS58 address format implementation, see the [`Ss58Codec`](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Ss58Codec.html){target=\\_blank} trait in the Rust Docs."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 11, "depth": 3, "title": "Address Type", "anchor": "address-type", "start_char": 21100, "end_char": 22035, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "### Address Type\n\nThe address type defines how an address is interpreted and to which network it belongs. Polkadot SDK uses different prefixes to distinguish between various chains and address formats:\n\n- **Address types `0-63`**: Simple addresses, commonly used for network identifiers.\n- **Address types `64-127`**: Full addresses that support a wider range of network identifiers.\n- **Address types `128-255`**: Reserved for future address format extensions.\n\nFor example, Polkadot’s main network uses an address type of 0, while Kusama uses 2. This ensures that addresses can be used without confusion between networks.\n\nThe address type is always encoded as part of the SS58 address, making it easy to quickly identify the network. Refer to the [SS58 registry](https://github.com/paritytech/ss58-registry){target=\\_blank} for the canonical listing of all address type identifiers and how they map to Polkadot SDK-based networks."} {"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 12, "depth": 3, "title": "Address Length", "anchor": "address-length", "start_char": 22035, "end_char": 23229, "estimated_token_count": 268, "token_estimator": "heuristic-v1", "text": "### Address Length\n\nSS58 addresses can have different lengths depending on the specific format. Address lengths range from as short as 3 to 35 bytes, depending on the complexity of the address and network requirements. 
This flexibility allows SS58 addresses to adapt to different chains while providing a secure encoding mechanism.\n\n| Total | Type | Raw account | Checksum |\n|-------|------|-------------|----------|\n| 3 | 1 | 1 | 1 |\n| 4 | 1 | 2 | 1 |\n| 5 | 1 | 2 | 2 |\n| 6 | 1 | 4 | 1 |\n| 7 | 1 | 4 | 2 |\n| 8 | 1 | 4 | 3 |\n| 9 | 1 | 4 | 4 |\n| 10 | 1 | 8 | 1 |\n| 11 | 1 | 8 | 2 |\n| 12 | 1 | 8 | 3 |\n| 13 | 1 | 8 | 4 |\n| 14 | 1 | 8 | 5 |\n| 15 | 1 | 8 | 6 |\n| 16 | 1 | 8 | 7 |\n| 17 | 1 | 8 | 8 |\n| 35 | 1 | 32 | 2 |\n\nSS58 addresses also support different payload sizes, allowing a flexible range of account identifiers."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 13, "depth": 3, "title": "Checksum Types", "anchor": "checksum-types", "start_char": 23229, "end_char": 23694, "estimated_token_count": 94, "token_estimator": "heuristic-v1", "text": "### Checksum Types\n\nA checksum is applied to validate SS58 addresses. Polkadot SDK uses a Blake2b-512 hash function to calculate the checksum, which is appended to the address before encoding. The checksum length can vary depending on the address format (e.g., 1-byte, 2-byte, or longer), providing varying levels of validation strength.\n\nThe checksum ensures that an address is not modified or corrupted, adding an extra layer of security for account management."} -{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 14, "depth": 3, "title": "Validating Addresses", "anchor": "validating-addresses", "start_char": 23694, "end_char": 28474, "estimated_token_count": 1074, "token_estimator": "heuristic-v1", "text": "### Validating Addresses\n\nSS58 addresses can be validated using the subkey command-line interface or the Polkadot.js API. These tools help ensure an address is correctly formatted and valid for the intended network. The following sections will provide an overview of how validation works with these tools.\n\n#### Using Subkey\n\n[Subkey](https://paritytech.github.io/polkadot-sdk/master/subkey/index.html){target=\\_blank} is a CLI tool provided by Polkadot SDK for generating and managing keys. It can inspect and validate SS58 addresses.\n\nThe `inspect` command gets a public key and an SS58 address from the provided secret URI. The basic syntax for the `subkey inspect` command is:\n\n```bash\nsubkey inspect [flags] [options] uri\n```\n\nFor the `uri` command-line argument, you can specify the secret seed phrase, a hex-encoded private key, or an SS58 address. If the input is a valid address, the `subkey` program displays the corresponding hex-encoded public key, account identifier, and SS58 addresses.\n\nFor example, to inspect the public keys derived from a secret seed phrase, you can run a command similar to the following:\n\n```bash\nsubkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n Secret phrase `caution juice atom organ advance problem want pledge someone senior holiday very` is account:\n Secret seed: 0xc8fa03532fb22ee1f7f6908b9c02b4e72483f0dbd66e4cd456b8f34c6230b849\n Public key (hex): 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n Public key (SS58): 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n Account ID: 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n SS58 Address: 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n
\n\nThe `subkey` program assumes an address is based on a public/private key pair. If you inspect an address, the command returns the 32-byte account identifier.\n\nHowever, not all addresses in Polkadot SDK-based networks are based on keys.\n\nDepending on the command-line options you specify and the input you provided, the command output might also display the network for which the address has been encoded. For example:\n\n```bash\nsubkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n Public Key URI `12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU` is account:\n Network ID/Version: polkadot\n Public key (hex): 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Account ID: 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Public key (SS58): 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n SS58 Address: 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n
\n\n#### Using Polkadot.js API\n\nTo verify an address in JavaScript or TypeScript projects, you can use the functions built into the [Polkadot.js API](https://polkadot.js.org/docs/){target=\\_blank}. For example:\n\n```js\n// Import Polkadot.js API dependencies\nconst { decodeAddress, encodeAddress } = require('@polkadot/keyring');\nconst { hexToU8a, isHex } = require('@polkadot/util');\n\n// Specify an address to test.\nconst address = 'INSERT_ADDRESS_TO_TEST';\n\n// Check address\nconst isValidSubstrateAddress = () => {\n try {\n encodeAddress(isHex(address) ? hexToU8a(address) : decodeAddress(address));\n\n return true;\n } catch (error) {\n return false;\n }\n};\n\n// Query result\nconst isValid = isValidSubstrateAddress();\nconsole.log(isValid);\n\n```\n\nIf the function returns `true`, the specified address is a valid address.\n\n#### Other SS58 Implementations\n\nSupport for encoding and decoding Polkadot SDK SS58 addresses has been implemented in several other languages and libraries.\n\n- **Crystal**: [`wyhaines/base58.cr`](https://github.com/wyhaines/base58.cr){target=\\_blank}\n- **Go**: [`itering/subscan-plugin`](https://github.com/itering/subscan-plugin){target=\\_blank}\n- **Python**: [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\\_blank}\n- **TypeScript**: [`subsquid/squid-sdk`](https://github.com/subsquid/squid-sdk){target=\\_blank}"} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 2, "depth": 3, "title": "Account", "anchor": "account", "start_char": 862, "end_char": 3162, "estimated_token_count": 569, "token_estimator": "heuristic-v1", "text": "### Account\n\nThe [`Account` data type](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Account.html){target=\\_blank} is a storage map within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank} that links an account ID to its corresponding data. This structure is fundamental for mapping account-related information within the chain.\n\nThe code snippet below shows how accounts are defined:\n\n```rs\n /// The full account information for a particular account ID.\n \t#[pallet::storage]\n \t#[pallet::getter(fn account)]\n \tpub type Account = StorageMap<\n \t\t_,\n \t\tBlake2_128Concat,\n \t\tT::AccountId,\n \t\tAccountInfo,\n \t\tValueQuery,\n \t>;\n```\n\nThe preceding code block defines a storage map named `Account`. The `StorageMap` is a type of on-chain storage that maps keys to values. In the `Account` map, the key is an account ID, and the value is the account's information. Here, `T` represents the generic parameter for the runtime configuration, which is defined by the pallet's configuration trait (`Config`).\n\nThe `StorageMap` consists of the following parameters:\n\n- **`_`**: Used in macro expansion and acts as a placeholder for the storage prefix type. Tells the macro to insert the default prefix during expansion.\n- **`Blake2_128Concat`**: The hashing function applied to keys in the storage map.\n- **`T: :AccountId`**: Represents the key type, which corresponds to the account’s unique ID.\n- **`AccountInfo`**: The value type stored in the map. 
For each account ID, the map stores an `AccountInfo` struct containing:\n\n - **`T::Nonce`**: A nonce for the account, which is incremented with each transaction to ensure transaction uniqueness.\n - **`T::AccountData`**: Custom account data defined by the runtime configuration, which could include balances, locked funds, or other relevant information.\n \n- **`ValueQuery`**: Defines how queries to the storage map behave when no value is found; returns a default value instead of `None`.\n\nFor a detailed explanation of storage maps, see the [`StorageMap`](https://paritytech.github.io/polkadot-sdk/master/frame_support/storage/types/struct.StorageMap.html){target=\\_blank} entry in the Rust docs."}
+{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 3, "depth": 3, "title": "Account Info", "anchor": "account-info", "start_char": 3162, "end_char": 5825, "estimated_token_count": 617, "token_estimator": "heuristic-v1", "text": "### Account Info\n\nThe `AccountInfo` structure is another key element within the [System pallet](https://paritytech.github.io/polkadot-sdk/master/src/frame_system/lib.rs.html){target=\\_blank}, providing more granular details about each account's state. This structure tracks vital data, such as the number of transactions and the account’s relationships with other modules.\n\n```rs\n/// Information of an account.\n#[derive(Clone, Eq, PartialEq, Default, RuntimeDebug, Encode, Decode, TypeInfo, MaxEncodedLen)]\npub struct AccountInfo<Nonce, AccountData> {\n\t/// The number of transactions this account has sent.\n\tpub nonce: Nonce,\n\t/// The number of other modules that currently depend on this account's existence. The account\n\t/// cannot be reaped until this is zero.\n\tpub consumers: RefCount,\n\t/// The number of other modules that allow this account to exist. The account may not be reaped\n\t/// until this and `sufficients` are both zero.\n\tpub providers: RefCount,\n\t/// The number of modules that allow this account to exist for their own purposes only. The\n\t/// account may not be reaped until this and `providers` are both zero.\n\tpub sufficients: RefCount,\n\t/// The additional data that belongs to this account. Used to store the balance(s) in a lot of\n\t/// chains.\n\tpub data: AccountData,\n}\n```\n\nThe `AccountInfo` structure includes the following components:\n\n- **`nonce`**: Tracks the number of transactions initiated by the account, which ensures transaction uniqueness and prevents replay attacks.\n- **`consumers`**: Counts how many other modules or pallets rely on this account’s existence. The account cannot be removed from the chain (reaped) until this count reaches zero.\n- **`providers`**: Tracks how many modules permit this account’s existence. An account can only be reaped once both `providers` and `sufficients` are zero.\n- **`sufficients`**: Represents the number of modules that allow the account to exist for internal purposes, independent of any other modules.\n- **`AccountData`**: A flexible data structure that can be customized in the runtime configuration, usually containing balances or other user-specific data.\n\nThis structure helps manage an account's state and prevents its premature removal while it is still referenced by other on-chain data or modules. 
The [`AccountInfo`](https://paritytech.github.io/polkadot-sdk/master/frame_system/struct.AccountInfo.html){target=\\_blank} structure can vary as long as it satisfies the trait bounds defined by the `AccountData` associated type in the [`frame-system::pallet::Config`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/trait.Config.html){target=\\_blank} trait."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 4, "depth": 3, "title": "Account Reference Counters", "anchor": "account-reference-counters", "start_char": 5825, "end_char": 10918, "estimated_token_count": 1040, "token_estimator": "heuristic-v1", "text": "### Account Reference Counters\n\nPolkadot SDK uses reference counters to track an account’s dependencies across different runtime modules. These counters ensure that accounts remain active while data is associated with them.\n\nThe reference counters include:\n\n- **`consumers`**: Prevents account removal while other pallets still rely on the account.\n- **`providers`**: Ensures an account is active before other pallets store data related to it.\n- **`sufficients`**: Indicates the account’s independence, ensuring it can exist even without a native token balance, such as when holding sufficient alternative assets.\n\n#### Providers Reference Counters\n\nThe `providers` counter ensures that an account is ready to be depended upon by other runtime modules. For example, it is incremented when an account has a balance above the existential deposit, which marks the account as active.\n\nThe system requires this reference counter to be greater than zero for the `consumers` counter to be incremented, ensuring the account is stable before any dependencies are added.\n\n#### Consumers Reference Counters\n\nThe `consumers` counter ensures that the account cannot be reaped until all references to it across the runtime have been removed. This check prevents the accidental deletion of accounts that still have active on-chain data.\n\nIt is the user’s responsibility to clear out any data from other runtime modules if they wish to remove their account and reclaim their existential deposit.\n\n#### Sufficients Reference Counter\n\nThe `sufficients` counter tracks accounts that can exist independently without relying on a native account balance. This is useful for accounts holding other types of assets, like tokens, without needing a minimum balance in the native token.\n\nFor instance, the [Assets pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_assets/index.html){target=\\_blank}, may increment this counter for an account holding sufficient tokens.\n\n#### Account Deactivation\n\nIn Polkadot SDK-based chains, an account is deactivated when its reference counters (such as `providers`, `consumers`, and `sufficient`) reach zero. These counters ensure the account remains active as long as other runtime modules or pallets reference it.\n\nWhen all dependencies are cleared and the counters drop to zero, the account becomes deactivated and may be removed from the chain (reaped). This is particularly important in Polkadot SDK-based blockchains, where accounts with balances below the existential deposit threshold are pruned from storage to conserve state resources.\n\nEach pallet that references an account has cleanup functions that decrement these counters when the pallet no longer depends on the account. 
Once these counters reach zero, the account is marked for deactivation.\n\n#### Updating Counters\n\nThe Polkadot SDK provides runtime developers with various methods to manage account lifecycle events, such as deactivation or incrementing reference counters. These methods ensure that accounts cannot be reaped while still in use.\n\nThe following helper functions manage these counters:\n\n- **`inc_consumers()`**: Increments the `consumer` reference counter for an account, signaling that another pallet depends on it.\n- **`dec_consumers()`**: Decrements the `consumer` reference counter, signaling that a pallet no longer relies on the account.\n- **`inc_providers()`**: Increments the `provider` reference counter, ensuring the account remains active.\n- **`dec_providers()`**: Decrements the `provider` reference counter, allowing for account deactivation when no longer in use.\n- **`inc_sufficients()`**: Increments the `sufficient` reference counter for accounts that hold sufficient assets.\n- **`dec_sufficients()`**: Decrements the `sufficient` reference counter.\n\nTo ensure proper account cleanup and lifecycle management, a corresponding decrement should be made for each increment action.\n\nThe `System` pallet offers three query functions to assist developers in tracking account states:\n\n- **[`can_inc_consumer()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_inc_consumer){target=\\_blank}**: Checks if the account can safely increment the consumer reference.\n- **[`can_dec_provider()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.can_dec_provider){target=\\_blank}**: Ensures that no consumers exist before allowing the decrement of the provider counter.\n- **[`is_provider_required()`](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html#method.is_provider_required){target=\\_blank}**: Verifies whether the account still has any active consumer references.\n\nThis modular and flexible system of reference counters tightly controls the lifecycle of accounts in Polkadot SDK-based blockchains, preventing the accidental removal or retention of unneeded accounts. You can refer to the [System pallet Rust docs](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/struct.Pallet.html){target=\\_blank} for more details."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 5, "depth": 2, "title": "Account Balance Types", "anchor": "account-balance-types", "start_char": 10918, "end_char": 12838, "estimated_token_count": 465, "token_estimator": "heuristic-v1", "text": "## Account Balance Types\n\nIn the Polkadot ecosystem, account balances are categorized into different types based on how the funds are utilized and their availability. These balance types determine the actions that can be performed, such as transferring tokens, paying transaction fees, or participating in governance activities. Understanding these balance types helps developers manage user accounts and implement balance-dependent logic.\n\n!!! note \"A more efficient distribution of account balance types is in development\"\n Soon, pallets in the Polkadot SDK will implement the [`Fungible` trait](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} (see the [tracking issue](https://github.com/paritytech/polkadot-sdk/issues/226){target=\\_blank} for more details). 
For example, the [`transaction-storage`](https://paritytech.github.io/polkadot-sdk/master/pallet_transaction_storage/index.html){target=\\_blank} pallet changed the implementation of the [`Currency`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/currency/index.html){target=\\_blank} trait (see the [Refactor transaction storage pallet to use fungible traits](https://github.com/paritytech/polkadot-sdk/pull/1800){target=\\_blank} PR for further details):\n\n ```rust\n type BalanceOf<T> = <<T as Config>::Currency as Currency<<T as frame_system::Config>::AccountId>>::Balance;\n ```\n \n To the [`Fungible`](https://paritytech.github.io/polkadot-sdk/master/frame_support/traits/tokens/fungible/index.html){target=\\_blank} trait:\n\n ```rust\n type BalanceOf<T> = <<T as Config>::Currency as FnInspect<<T as frame_system::Config>::AccountId>>::Balance;\n ```\n \n This update will enable more efficient use of account balances, allowing the free balance to be utilized for on-chain activities such as setting proxies and managing identities."}
+{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 6, "depth": 3, "title": "Balance Types", "anchor": "balance-types", "start_char": 12838, "end_char": 15313, "estimated_token_count": 601, "token_estimator": "heuristic-v1", "text": "### Balance Types\n\nThe five main balance types are:\n\n- **Free balance**: Represents the total tokens available to the account for any on-chain activity, including staking, governance, and voting. However, it may not be fully spendable or transferrable if portions of it are locked or reserved.\n- **Locked balance**: Portions of the free balance that cannot be spent or transferred because they are tied up in specific activities like [staking](https://wiki.polkadot.com/learn/learn-staking/#nominating-validators){target=\\_blank}, [vesting](https://wiki.polkadot.com/learn/learn-guides-transfers/#vested-transfers-with-the-polkadot-js-ui){target=\\_blank}, or participating in [governance](https://wiki.polkadot.com/learn/learn-polkadot-opengov/#voting-on-a-referendum){target=\\_blank}. While the tokens remain part of the free balance, they are non-transferable for the duration of the lock.\n- **Reserved balance**: Funds locked by specific system actions, such as setting up an [identity](https://wiki.polkadot.com/learn/learn-identity/){target=\\_blank}, creating [proxies](https://wiki.polkadot.com/learn/learn-proxies/){target=\\_blank}, or submitting [deposits for governance proposals](https://wiki.polkadot.com/learn/learn-guides-polkadot-opengov/#claiming-opengov-deposits){target=\\_blank}. These tokens are not part of the free balance and cannot be spent unless they are unreserved.\n- **Spendable balance**: The portion of the free balance that is available for immediate spending or transfers. It is calculated by subtracting the maximum of locked or reserved amounts from the free balance, ensuring that existential deposit limits are met.\n- **Untouchable balance**: Funds that cannot be directly spent or transferred but may still be utilized for on-chain activities, such as governance participation or staking. These tokens are typically tied to certain actions or locked for a specific period.\n\nThe spendable balance is calculated as follows:\n\n```text\nspendable = free - max(locked - reserved, ED)\n```\n\nHere, `free`, `locked`, and `reserved` are defined above. 
The `ED` represents the [existential deposit](https://wiki.polkadot.com/learn/learn-accounts/#existential-deposit-and-reaping){target=\\_blank}, the minimum balance required to keep an account active and prevent it from being reaped. You may find you can't see all balance types when looking at your account via a wallet. Wallet providers often display only spendable, locked, and reserved balances."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 7, "depth": 3, "title": "Locks", "anchor": "locks", "start_char": 15313, "end_char": 17508, "estimated_token_count": 464, "token_estimator": "heuristic-v1", "text": "### Locks\n\nLocks are applied to an account's free balance, preventing that portion from being spent or transferred. Locks are automatically placed when an account participates in specific on-chain activities, such as staking or governance. Although multiple locks may be applied simultaneously, they do not stack. Instead, the largest lock determines the total amount of locked tokens.\n\nLocks follow these basic rules:\n\n- If different locks apply to varying amounts, the largest lock amount takes precedence.\n- If multiple locks apply to the same amount, the lock with the longest duration governs when the balance can be unlocked.\n\n#### Locks Example\n\nConsider an example where an account has 80 DOT locked for both staking and governance purposes like so:\n\n- 80 DOT is staked with a 28-day lock period.\n- 24 DOT is locked for governance with a 1x conviction and a 7-day lock period.\n- 4 DOT is locked for governance with a 6x conviction and a 224-day lock period.\n\nIn this case, the total locked amount is 80 DOT because only the largest lock (80 DOT from staking) governs the locked balance. These 80 DOT will be released at different times based on the lock durations. In this example, the 24 DOT locked for governance will be released first since the shortest lock period is seven days. The 80 DOT stake with a 28-day lock period is released next. Now, all that remains locked is the 4 DOT for governance. After 224 days, all 80 DOT (minus the existential deposit) will be free and transferable.\n\n![Illustration of Lock Example](/images/polkadot-protocol/parachain-basics/accounts/locks-example-2.webp)\n\n#### Edge Cases for Locks\n\nIn scenarios where multiple convictions and lock periods are active, the lock duration and amount are determined by the longest period and largest amount. For example, if you delegate with different convictions and attempt to undelegate during an active lock period, the lock may be extended for the full amount of tokens. For a detailed discussion on edge case lock behavior, see this [Stack Exchange post](https://substrate.stackexchange.com/questions/5067/delegating-and-undelegating-during-the-lock-period-extends-it-for-the-initial-am){target=\\_blank}."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 8, "depth": 3, "title": "Balance Types on Polkadot.js", "anchor": "balance-types-on-polkadotjs", "start_char": 17508, "end_char": 20573, "estimated_token_count": 611, "token_estimator": "heuristic-v1", "text": "### Balance Types on Polkadot.js\n\nPolkadot.js provides a user-friendly interface for managing and visualizing various account balances on Polkadot and Kusama networks. When interacting with Polkadot.js, you will encounter multiple balance types that are critical for understanding how your funds are distributed and restricted. 
This section explains how different balances are displayed in the Polkadot.js UI and what each type represents.\n\n![](/images/polkadot-protocol/parachain-basics/accounts/account-balance-types-1.webp)\n\nThe most common balance types displayed on Polkadot.js are:\n\n- **Total balance**: The total number of tokens available in the account. This includes all tokens, whether they are transferable, locked, reserved, or vested. However, the total balance does not always reflect what can be spent immediately. In this example, the total balance is 0.6274 KSM.\n\n- **Transferable balance**: Shows how many tokens are immediately available for transfer. It is calculated by subtracting the locked and reserved balances from the total balance. For example, if an account has a total balance of 0.6274 KSM and a transferable balance of 0.0106 KSM, only the latter amount can be sent or spent freely.\n\n- **Vested balance**: Tokens that are allocated to the account but released according to a specific schedule. Vested tokens remain locked and cannot be transferred until fully vested. For example, an account with a vested balance of 0.2500 KSM means that this amount is owned but not yet transferable.\n\n- **Locked balance**: Tokens that are temporarily restricted from being transferred or spent. These locks typically result from participating in staking, governance, or vested transfers. In Polkadot.js, locked balances do not stack—only the largest lock is applied. For instance, if an account has 0.5500 KSM locked for governance and staking, the locked balance would display 0.5500 KSM, not the sum of all locked amounts.\n\n- **Reserved balance**: Refers to tokens locked for specific on-chain actions, such as setting an identity, creating a proxy, or making governance deposits. Reserved tokens are not part of the free balance, but can be freed by performing certain actions. For example, removing an identity would unreserve those funds.\n\n- **Bonded balance**: The tokens locked for staking purposes. Bonded tokens are not transferable until they are unbonded after the unbonding period.\n\n- **Redeemable balance**: The number of tokens that have completed the unbonding period and are ready to be unlocked and transferred again. For example, if an account has a redeemable balance of 0.1000 KSM, those tokens are now available for spending.\n\n- **Democracy balance**: Reflects the number of tokens locked for governance activities, such as voting on referenda. These tokens are locked for the duration of the governance action and are only released after the lock period ends.\n\nBy understanding these balance types and their implications, developers and users can better manage their funds and engage with on-chain activities more effectively."}
+{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 9, "depth": 2, "title": "Address Formats", "anchor": "address-formats", "start_char": 20573, "end_char": 21032, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Address Formats\n\nThe SS58 address format is a core component of the Polkadot SDK that enables accounts to be uniquely identified across Polkadot-based networks. This format is a modified version of Bitcoin's Base58Check encoding, specifically designed to accommodate the multi-chain nature of the Polkadot ecosystem. 
SS58 encoding allows each chain to define its own set of addresses while maintaining compatibility and checksum validation for security."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 10, "depth": 3, "title": "Basic Format", "anchor": "basic-format", "start_char": 21032, "end_char": 22274, "estimated_token_count": 295, "token_estimator": "heuristic-v1", "text": "### Basic Format\n\nSS58 addresses consist of three main components:\n\n```text\nbase58encode(concat(,
, ))\n```\n\n- **Address type**: A byte or set of bytes that define the network (or chain) for which the address is intended. This ensures that addresses are unique across different Polkadot SDK-based chains.\n- **Address**: The public key of the account encoded as bytes.\n- **Checksum**: A hash-based checksum which ensures that addresses are valid and unaltered. The checksum is derived from the concatenated address type and address components, ensuring integrity.\n\nThe encoding process transforms the concatenated components into a Base58 string, providing a compact and human-readable format that avoids easily confused characters (e.g., zero '0', capital 'O', lowercase 'l'). This encoding function ([`encode`](https://docs.rs/bs58/latest/bs58/fn.encode.html){target=\\_blank}) is implemented exactly as defined in Bitcoin and IPFS specifications, using the same alphabet as both implementations.\n\nFor more details about the SS58 address format implementation, see the [`Ss58Codec`](https://paritytech.github.io/polkadot-sdk/master/sp_core/crypto/trait.Ss58Codec.html){target=\\_blank} trait in the Rust Docs."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 11, "depth": 3, "title": "Address Type", "anchor": "address-type", "start_char": 22274, "end_char": 23209, "estimated_token_count": 203, "token_estimator": "heuristic-v1", "text": "### Address Type\n\nThe address type defines how an address is interpreted and to which network it belongs. Polkadot SDK uses different prefixes to distinguish between various chains and address formats:\n\n- **Address types `0-63`**: Simple addresses, commonly used for network identifiers.\n- **Address types `64-127`**: Full addresses that support a wider range of network identifiers.\n- **Address types `128-255`**: Reserved for future address format extensions.\n\nFor example, Polkadot’s main network uses an address type of 0, while Kusama uses 2. This ensures that addresses can be used without confusion between networks.\n\nThe address type is always encoded as part of the SS58 address, making it easy to quickly identify the network. Refer to the [SS58 registry](https://github.com/paritytech/ss58-registry){target=\\_blank} for the canonical listing of all address type identifiers and how they map to Polkadot SDK-based networks."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 12, "depth": 3, "title": "Address Length", "anchor": "address-length", "start_char": 23209, "end_char": 24403, "estimated_token_count": 268, "token_estimator": "heuristic-v1", "text": "### Address Length\n\nSS58 addresses can have different lengths depending on the specific format. Address lengths range from as short as 3 to 35 bytes, depending on the complexity of the address and network requirements. 
This flexibility allows SS58 addresses to adapt to different chains while providing a secure encoding mechanism.\n\n| Total | Type | Raw account | Checksum |\n|-------|------|-------------|----------|\n| 3 | 1 | 1 | 1 |\n| 4 | 1 | 2 | 1 |\n| 5 | 1 | 2 | 2 |\n| 6 | 1 | 4 | 1 |\n| 7 | 1 | 4 | 2 |\n| 8 | 1 | 4 | 3 |\n| 9 | 1 | 4 | 4 |\n| 10 | 1 | 8 | 1 |\n| 11 | 1 | 8 | 2 |\n| 12 | 1 | 8 | 3 |\n| 13 | 1 | 8 | 4 |\n| 14 | 1 | 8 | 5 |\n| 15 | 1 | 8 | 6 |\n| 16 | 1 | 8 | 7 |\n| 17 | 1 | 8 | 8 |\n| 35 | 1 | 32 | 2 |\n\nSS58 addresses also support different payload sizes, allowing a flexible range of account identifiers."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 13, "depth": 3, "title": "Checksum Types", "anchor": "checksum-types", "start_char": 24403, "end_char": 24868, "estimated_token_count": 94, "token_estimator": "heuristic-v1", "text": "### Checksum Types\n\nA checksum is applied to validate SS58 addresses. Polkadot SDK uses a Blake2b-512 hash function to calculate the checksum, which is appended to the address before encoding. The checksum length can vary depending on the address format (e.g., 1-byte, 2-byte, or longer), providing varying levels of validation strength.\n\nThe checksum ensures that an address is not modified or corrupted, adding an extra layer of security for account management."} +{"page_id": "polkadot-protocol-parachain-basics-accounts", "page_title": "Polkadot SDK Accounts", "index": 14, "depth": 3, "title": "Validating Addresses", "anchor": "validating-addresses", "start_char": 24868, "end_char": 29648, "estimated_token_count": 1074, "token_estimator": "heuristic-v1", "text": "### Validating Addresses\n\nSS58 addresses can be validated using the subkey command-line interface or the Polkadot.js API. These tools help ensure an address is correctly formatted and valid for the intended network. The following sections will provide an overview of how validation works with these tools.\n\n#### Using Subkey\n\n[Subkey](https://paritytech.github.io/polkadot-sdk/master/subkey/index.html){target=\\_blank} is a CLI tool provided by Polkadot SDK for generating and managing keys. It can inspect and validate SS58 addresses.\n\nThe `inspect` command gets a public key and an SS58 address from the provided secret URI. The basic syntax for the `subkey inspect` command is:\n\n```bash\nsubkey inspect [flags] [options] uri\n```\n\nFor the `uri` command-line argument, you can specify the secret seed phrase, a hex-encoded private key, or an SS58 address. If the input is a valid address, the `subkey` program displays the corresponding hex-encoded public key, account identifier, and SS58 addresses.\n\nFor example, to inspect the public keys derived from a secret seed phrase, you can run a command similar to the following:\n\n```bash\nsubkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"caution juice atom organ advance problem want pledge someone senior holiday very\"\n Secret phrase `caution juice atom organ advance problem want pledge someone senior holiday very` is account:\n Secret seed: 0xc8fa03532fb22ee1f7f6908b9c02b4e72483f0dbd66e4cd456b8f34c6230b849\n Public key (hex): 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n Public key (SS58): 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n Account ID: 0xd6a3105d6768e956e9e5d41050ac29843f98561410d3a47f9dd5b3b227ab8746\n SS58 Address: 5Gv8YYFu8H1btvmrJy9FjjAWfb99wrhV3uhPFoNEr918utyR\n
\n\nThe `subkey` program assumes an address is based on a public/private key pair. If you inspect an address, the command returns the 32-byte account identifier.\n\nHowever, not all addresses in Polkadot SDK-based networks are based on keys.\n\nDepending on the command-line options you specify and the input you provided, the command output might also display the network for which the address has been encoded. For example:\n\n```bash\nsubkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n```\n\nThe command displays output similar to the following:\n\n
\n subkey inspect \"12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\"\n Public Key URI `12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU` is account:\n Network ID/Version: polkadot\n Public key (hex): 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Account ID: 0x46ebddef8cd9bb167dc30878d7113b7e168e6f0646beffd77d69d39bad76b47a\n Public key (SS58): 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n SS58 Address: 12bzRJfh7arnnfPPUZHeJUaE62QLEwhK48QnH9LXeK2m1iZU\n
\n\n#### Using Polkadot.js API\n\nTo verify an address in JavaScript or TypeScript projects, you can use the functions built into the [Polkadot.js API](https://polkadot.js.org/docs/){target=\\_blank}. For example:\n\n```js\n// Import Polkadot.js API dependencies\nconst { decodeAddress, encodeAddress } = require('@polkadot/keyring');\nconst { hexToU8a, isHex } = require('@polkadot/util');\n\n// Specify an address to test.\nconst address = 'INSERT_ADDRESS_TO_TEST';\n\n// Check address\nconst isValidSubstrateAddress = () => {\n try {\n encodeAddress(isHex(address) ? hexToU8a(address) : decodeAddress(address));\n\n return true;\n } catch (error) {\n return false;\n }\n};\n\n// Query result\nconst isValid = isValidSubstrateAddress();\nconsole.log(isValid);\n\n```\n\nIf the function returns `true`, the specified address is a valid address.\n\n#### Other SS58 Implementations\n\nSupport for encoding and decoding Polkadot SDK SS58 addresses has been implemented in several other languages and libraries.\n\n- **Crystal**: [`wyhaines/base58.cr`](https://github.com/wyhaines/base58.cr){target=\\_blank}\n- **Go**: [`itering/subscan-plugin`](https://github.com/itering/subscan-plugin){target=\\_blank}\n- **Python**: [`polkascan/py-scale-codec`](https://github.com/polkascan/py-scale-codec){target=\\_blank}\n- **TypeScript**: [`subsquid/squid-sdk`](https://github.com/subsquid/squid-sdk){target=\\_blank}"} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 10, "end_char": 707, "estimated_token_count": 130, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn the Polkadot SDK, blocks are fundamental to the functioning of the blockchain, serving as containers for [transactions](/polkadot-protocol/parachain-basics/blocks-transactions-fees/transactions/){target=\\_blank} and changes to the chain's state. Blocks consist of headers and an array of transactions, ensuring the integrity and validity of operations on the network. This guide explores the essential components of a block, the process of block production, and how blocks are validated and imported across the network. By understanding these concepts, developers can better grasp how blockchains maintain security, consistency, and performance within the Polkadot ecosystem."} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 1, "depth": 2, "title": "What is a Block?", "anchor": "what-is-a-block", "start_char": 707, "end_char": 1844, "estimated_token_count": 226, "token_estimator": "heuristic-v1", "text": "## What is a Block?\n\nIn the Polkadot SDK, a block is a fundamental unit that encapsulates both the header and an array of transactions. The block header includes critical metadata to ensure the integrity and sequence of the blockchain. Here's a breakdown of its components:\n\n- **Block height**: Indicates the number of blocks created in the chain so far.\n- **Parent hash**: The hash of the previous block, providing a link to maintain the blockchain's immutability.\n- **Transaction root**: Cryptographic digest summarizing all transactions in the block.\n- **State root**: A cryptographic digest representing the post-execution state.\n- **Digest**: Additional information that can be attached to a block, such as consensus-related messages.\n\nEach transaction is part of a series that is executed according to the runtime's rules. 
The transaction root is a cryptographic digest of this series, which prevents alterations and enables succinct verification by light clients. This verification process allows light clients to confirm whether a transaction exists in a block with only the block header, avoiding downloading the entire block."} {"page_id": "polkadot-protocol-parachain-basics-blocks-transactions-fees-blocks", "page_title": "Blocks", "index": 2, "depth": 2, "title": "Block Production", "anchor": "block-production", "start_char": 1844, "end_char": 2168, "estimated_token_count": 57, "token_estimator": "heuristic-v1", "text": "## Block Production\n\nWhen an authoring node is authorized to create a new block, it selects transactions from the transaction queue based on priority. This step, known as block production, relies heavily on the executive module to manage the initialization and finalization of blocks. The process is summarized as follows:"} @@ -1315,11 +1315,11 @@ {"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 0, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 405, "end_char": 455, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials-onchain-governance", "page_title": "On-Chain Governance Tutorials", "index": 1, "depth": 2, "title": "Additional Resources", "anchor": "additional-resources", "start_char": 455, "end_char": 872, "estimated_token_count": 114, "token_estimator": "heuristic-v1", "text": "## Additional Resources\n\n"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 30, "end_char": 866, "estimated_token_count": 192, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn previous tutorials, you learned how to [create a custom pallet](/tutorials/polkadot-sdk/parachains/zero-to-hero/build-custom-pallet/){target=\\_blank} and [test it](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-unit-testing/){target=\\_blank}. The next step is to include this pallet in your runtime, integrating it into the core logic of your blockchain.\n\nThis tutorial will guide you through adding two pallets to your runtime: the custom pallet you previously developed and the [utility pallet](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/index.html){target=\\_blank}. This standard Polkadot SDK pallet provides powerful dispatch functionality. The utility pallet offers, for example, batch dispatch, a stateless operation that enables executing multiple calls in a single transaction."} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 1, "depth": 2, "title": "Add the Pallets as Dependencies", "anchor": "add-the-pallets-as-dependencies", "start_char": 866, "end_char": 6394, "estimated_token_count": 1262, "token_estimator": "heuristic-v1", "text": "## Add the Pallets as Dependencies\n\nFirst, you'll update the runtime's `Cargo.toml` file to include the Utility pallet and your custom pallets as dependencies for the runtime. Follow these steps:\n\n1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. 
Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:\n\n ```toml hl_lines=\"4\" title=\"runtime/Cargo.toml\"\n \n ...\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n ...\n ], default-features = false }\n ```\n\n2. In the same `[dependencies]` section, add the custom pallet that you built from scratch with the following line:\n\n ```toml hl_lines=\"3\" title=\"Cargo.toml\"\n [dependencies]\n ...\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n ```\n\n3. In the `[features]` section, add the custom pallet to the `std` feature list:\n\n ```toml hl_lines=\"5\" title=\"Cargo.toml\"\n [features]\n default = [\"std\"]\n std = [\n ...\n \"custom-pallet/std\",\n ...\n ]\n ```\n\n3. Save the changes and close the `Cargo.toml` file.\n\n Once you have saved your file, it should look like the following:\n\n ???- code \"runtime/Cargo.toml\"\n \n ```rust title=\"runtime/Cargo.toml\"\n [package]\n name = \"parachain-template-runtime\"\n description = \"A parachain runtime template built with Substrate and Cumulus, part of Polkadot Sdk.\"\n version = \"0.1.0\"\n license = \"Unlicense\"\n authors.workspace = true\n homepage.workspace = true\n repository.workspace = true\n edition.workspace = true\n publish = false\n\n [package.metadata.docs.rs]\n targets = [\"x86_64-unknown-linux-gnu\"]\n\n [build-dependencies]\n docify = { workspace = true }\n substrate-wasm-builder = { optional = true, workspace = true, default-features = true }\n\n [dependencies]\n codec = { features = [\"derive\"], workspace = true }\n cumulus-pallet-parachain-system.workspace = true\n docify = { workspace = true }\n hex-literal = { optional = true, workspace = true, default-features = true }\n log = { workspace = true }\n pallet-parachain-template = { path = \"../pallets/template\", default-features = false }\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n \"cumulus-pallet-aura-ext\",\n \"cumulus-pallet-session-benchmarking\",\n \"cumulus-pallet-weight-reclaim\",\n \"cumulus-pallet-xcm\",\n \"cumulus-pallet-xcmp-queue\",\n \"cumulus-primitives-aura\",\n \"cumulus-primitives-core\",\n \"cumulus-primitives-utility\",\n \"pallet-aura\",\n \"pallet-authorship\",\n \"pallet-balances\",\n \"pallet-collator-selection\",\n \"pallet-message-queue\",\n \"pallet-session\",\n \"pallet-sudo\",\n \"pallet-timestamp\",\n \"pallet-transaction-payment\",\n \"pallet-transaction-payment-rpc-runtime-api\",\n \"pallet-xcm\",\n \"parachains-common\",\n \"polkadot-parachain-primitives\",\n \"polkadot-runtime-common\",\n \"runtime\",\n \"staging-parachain-info\",\n \"staging-xcm\",\n \"staging-xcm-builder\",\n \"staging-xcm-executor\",\n ], default-features = false }\n scale-info = { features = [\"derive\"], workspace = true }\n serde_json = { workspace = true, default-features = false, features = [\n \"alloc\",\n ] }\n smallvec = { workspace = true, default-features = true }\n\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n\n [features]\n default = [\"std\"]\n std = [\n \"codec/std\",\n \"cumulus-pallet-parachain-system/std\",\n \"log/std\",\n \"pallet-parachain-template/std\",\n \"polkadot-sdk/std\",\n \"scale-info/std\",\n \"serde_json/std\",\n \"substrate-wasm-builder\",\n \"custom-pallet/std\",\n ]\n\n runtime-benchmarks = [\n \"cumulus-pallet-parachain-system/runtime-benchmarks\",\n \"hex-literal\",\n \"pallet-parachain-template/runtime-benchmarks\",\n 
\"polkadot-sdk/runtime-benchmarks\",\n ]\n\n try-runtime = [\n \"cumulus-pallet-parachain-system/try-runtime\",\n \"pallet-parachain-template/try-runtime\",\n \"polkadot-sdk/try-runtime\",\n ]\n\n # Enable the metadata hash generation.\n #\n # This is hidden behind a feature because it increases the compile time.\n # The wasm binary needs to be compiled twice, once to fetch the metadata,\n # generate the metadata hash and then a second time with the\n # `RUNTIME_METADATA_HASH` environment variable set for the `CheckMetadataHash`\n # extension.\n metadata-hash = [\"substrate-wasm-builder/metadata-hash\"]\n\n # A convenience feature for enabling things when doing a build\n # for an on-chain release.\n on-chain-release-build = [\"metadata-hash\"]\n\n ```\n\nUpdate your root parachain template's `Cargo.toml` file to include your custom pallet as a dependency. Follow these steps:\n\n1. Open the `./Cargo.toml` file and locate the `[workspace]` section. \n \n Make sure the `custom-pallet` is a member of the workspace:\n\n ```toml hl_lines=\"4\" title=\"Cargo.toml\"\n \n ```\n\n???- code \"./Cargo.toml\"\n\n ```rust title=\"./Cargo.toml\"\n \n ```"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 2, "depth": 3, "title": "Update the Runtime Configuration", "anchor": "update-the-runtime-configuration", "start_char": 6394, "end_char": 8299, "estimated_token_count": 406, "token_estimator": "heuristic-v1", "text": "### Update the Runtime Configuration\n\nConfigure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets:\n\n1. Add the `OriginCaller` import:\n\n ```rust title=\"mod.rs\" hl_lines=\"8\"\n // Local module imports\n use super::OriginCaller;\n ...\n ```\n\n2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file:\n\n ```rust title=\"mod.rs\" hl_lines=\"8-25\"\n ...\n /// Configure the pallet template in pallets/template.\n impl pallet_parachain_template::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n\n // Configure utility pallet.\n impl pallet_utility::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type RuntimeCall = RuntimeCall;\n type PalletsOrigin = OriginCaller;\n type WeightInfo = pallet_utility::weights::SubstrateWeight;\n }\n // Define counter max value runtime constant.\n parameter_types! {\n pub const CounterMaxValue: u32 = 500;\n }\n\n // Configure custom pallet.\n impl custom_pallet::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type CounterMaxValue = CounterMaxValue;\n }\n ```\n\n3. 
Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets:\n\n ```rust hl_lines=\"9-14\" title=\"lib.rs\"\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n ...\n )]\n pub struct Runtime;\n #[runtime::pallet_index(51)]\n pub type Utility = pallet_utility;\n\n #[runtime::pallet_index(52)]\n pub type CustomPallet = custom_pallet;\n }\n ```"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 3, "depth": 2, "title": "Recompile the Runtime", "anchor": "recompile-the-runtime", "start_char": 8299, "end_char": 8748, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Recompile the Runtime\n\nAfter adding and configuring your pallets in the runtime, the next step is to ensure everything is set up correctly. To do this, recompile the runtime with the following command (make sure you're in the project's root directory):\n\n```bash\ncargo build --release\n```\n\nThis command ensures the runtime compiles without errors, validates the pallet configurations, and prepares the build for subsequent testing or deployment."} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 4, "depth": 2, "title": "Run Your Chain Locally", "anchor": "run-your-chain-locally", "start_char": 8748, "end_char": 10221, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Run Your Chain Locally\n\nLaunch your parachain locally and start producing blocks:\n\n!!!tip\n Generated chain TestNet specifications include development accounts \"Alice\" and \"Bob.\" These accounts are pre-funded with native parachain currency, allowing you to sign and send TestNet transactions. Take a look at the [Polkadot.js Accounts section](https://polkadot.js.org/apps/#/accounts){target=\\_blank} to view the development accounts for your chain.\n\n1. Create a new chain specification file with the updated runtime:\n\n ```bash\n chain-spec-builder create -t development \\\n --relay-chain paseo \\\n --para-id 1000 \\\n --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \\\n named-preset development\n ```\n\n2. Start the omni node with the generated chain specification:\n\n ```bash\n polkadot-omni-node --chain ./chain_spec.json --dev\n ```\n\n3. Verify you can interact with the new pallets using the [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics){target=\\_blank} interface. Navigate to the **Extrinsics** tab and check that you can see both pallets:\n\n - Utility pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-1.webp)\n \n\n - Custom pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-2.webp)"} -{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 10221, "end_char": 10973, "estimated_token_count": 183, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Deploy on Paseo TestNet__\n\n ---\n\n Deploy your Polkadot SDK blockchain on Paseo! Follow this step-by-step guide for a seamless journey to a successful TestNet deployment.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/deploy-to-testnet/)\n\n- Tutorial __Pallet Benchmarking (Optional)__\n\n ---\n\n Discover how to measure extrinsic costs and assign precise weights to optimize your pallet for accurate fees and runtime performance.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-benchmarking/)\n\n
"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 1, "depth": 2, "title": "Add the Pallets as Dependencies", "anchor": "add-the-pallets-as-dependencies", "start_char": 866, "end_char": 8446, "estimated_token_count": 1835, "token_estimator": "heuristic-v1", "text": "## Add the Pallets as Dependencies\n\nFirst, you'll update the runtime's `Cargo.toml` file to include the Utility pallet and your custom pallets as dependencies for the runtime. Follow these steps:\n\n1. Open the `runtime/Cargo.toml` file and locate the `[dependencies]` section. Add pallet-utility as one of the features for the `polkadot-sdk` dependency with the following line:\n\n ```toml hl_lines=\"4\" title=\"runtime/Cargo.toml\"\n [dependencies]\n ...\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n ...\n ], default-features = false }\n ```\n\n2. In the same `[dependencies]` section, add the custom pallet that you built from scratch with the following line:\n\n ```toml hl_lines=\"3\" title=\"Cargo.toml\"\n [dependencies]\n ...\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n ```\n\n3. In the `[features]` section, add the custom pallet to the `std` feature list:\n\n ```toml hl_lines=\"5\" title=\"Cargo.toml\"\n \n ...\n \n ...\n ]\n ```\n\n3. Save the changes and close the `Cargo.toml` file.\n\n Once you have saved your file, it should look like the following:\n\n ???- code \"runtime/Cargo.toml\"\n \n ```rust title=\"runtime/Cargo.toml\"\n [package]\n name = \"parachain-template-runtime\"\n description = \"A parachain runtime template built with Substrate and Cumulus, part of Polkadot Sdk.\"\n version = \"0.1.0\"\n license = \"Unlicense\"\n authors.workspace = true\n homepage.workspace = true\n repository.workspace = true\n edition.workspace = true\n publish = false\n\n [package.metadata.docs.rs]\n targets = [\"x86_64-unknown-linux-gnu\"]\n\n [build-dependencies]\n docify = { workspace = true }\n substrate-wasm-builder = { optional = true, workspace = true, default-features = true }\n\n [dependencies]\n codec = { features = [\"derive\"], workspace = true }\n cumulus-pallet-parachain-system.workspace = true\n docify = { workspace = true }\n hex-literal = { optional = true, workspace = true, default-features = true }\n log = { workspace = true }\n pallet-parachain-template = { path = \"../pallets/template\", default-features = false }\n polkadot-sdk = { workspace = true, features = [\n \"pallet-utility\",\n \"cumulus-pallet-aura-ext\",\n \"cumulus-pallet-session-benchmarking\",\n \"cumulus-pallet-weight-reclaim\",\n \"cumulus-pallet-xcm\",\n \"cumulus-pallet-xcmp-queue\",\n \"cumulus-primitives-aura\",\n \"cumulus-primitives-core\",\n \"cumulus-primitives-utility\",\n \"pallet-aura\",\n \"pallet-authorship\",\n \"pallet-balances\",\n \"pallet-collator-selection\",\n \"pallet-message-queue\",\n \"pallet-session\",\n \"pallet-sudo\",\n \"pallet-timestamp\",\n \"pallet-transaction-payment\",\n \"pallet-transaction-payment-rpc-runtime-api\",\n \"pallet-xcm\",\n \"parachains-common\",\n \"polkadot-parachain-primitives\",\n \"polkadot-runtime-common\",\n \"runtime\",\n \"staging-parachain-info\",\n \"staging-xcm\",\n \"staging-xcm-builder\",\n \"staging-xcm-executor\",\n ], default-features = false }\n scale-info = { features = [\"derive\"], workspace = true }\n serde_json = { workspace = true, default-features = false, features = [\n \"alloc\",\n ] }\n smallvec = { workspace = 
true, default-features = true }\n\n custom-pallet = { path = \"../pallets/custom-pallet\", default-features = false }\n\n [features]\n default = [\"std\"]\n std = [\n \"codec/std\",\n \"cumulus-pallet-parachain-system/std\",\n \"log/std\",\n \"pallet-parachain-template/std\",\n \"polkadot-sdk/std\",\n \"scale-info/std\",\n \"serde_json/std\",\n \"substrate-wasm-builder\",\n \"custom-pallet/std\",\n ]\n\n runtime-benchmarks = [\n \"cumulus-pallet-parachain-system/runtime-benchmarks\",\n \"hex-literal\",\n \"pallet-parachain-template/runtime-benchmarks\",\n \"polkadot-sdk/runtime-benchmarks\",\n ]\n\n try-runtime = [\n \"cumulus-pallet-parachain-system/try-runtime\",\n \"pallet-parachain-template/try-runtime\",\n \"polkadot-sdk/try-runtime\",\n ]\n\n # Enable the metadata hash generation.\n #\n # This is hidden behind a feature because it increases the compile time.\n # The wasm binary needs to be compiled twice, once to fetch the metadata,\n # generate the metadata hash and then a second time with the\n # `RUNTIME_METADATA_HASH` environment variable set for the `CheckMetadataHash`\n # extension.\n metadata-hash = [\"substrate-wasm-builder/metadata-hash\"]\n\n # A convenience feature for enabling things when doing a build\n # for an on-chain release.\n on-chain-release-build = [\"metadata-hash\"]\n\n ```\n\nUpdate your root parachain template's `Cargo.toml` file to include your custom pallet as a dependency. Follow these steps:\n\n1. Open the `./Cargo.toml` file and locate the `[workspace]` section. \n \n Make sure the `custom-pallet` is a member of the workspace:\n\n ```toml hl_lines=\"4\" title=\"Cargo.toml\"\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n ```\n\n???- code \"./Cargo.toml\"\n\n ```rust title=\"./Cargo.toml\"\n [workspace.package]\n license = \"MIT-0\"\n authors = [\"Parity Technologies \"]\n homepage = \"https://paritytech.github.io/polkadot-sdk/\"\n repository = \"https://github.com/paritytech/polkadot-sdk-parachain-template.git\"\n edition = \"2021\"\n\n [workspace]\n default-members = [\"pallets/template\", \"runtime\"]\n members = [\n \"node\", \"pallets/custom-pallet\",\n \"pallets/template\",\n \"runtime\",\n ]\n resolver = \"2\"\n\n [workspace.dependencies]\n parachain-template-runtime = { path = \"./runtime\", default-features = false }\n pallet-parachain-template = { path = \"./pallets/template\", default-features = false }\n clap = { version = \"4.5.13\" }\n color-print = { version = \"0.3.4\" }\n docify = { version = \"0.2.9\" }\n futures = { version = \"0.3.31\" }\n jsonrpsee = { version = \"0.24.3\" }\n log = { version = \"0.4.22\", default-features = false }\n polkadot-sdk = { version = \"2503.0.1\", default-features = false }\n prometheus-endpoint = { version = \"0.17.2\", default-features = false, package = \"substrate-prometheus-endpoint\" }\n serde = { version = \"1.0.214\", default-features = false }\n codec = { version = \"3.7.4\", default-features = false, package = \"parity-scale-codec\" }\n cumulus-pallet-parachain-system = { version = \"0.20.0\", default-features = false }\n hex-literal = { version = \"0.4.1\", default-features = false }\n scale-info = { version = \"2.11.6\", default-features = false }\n serde_json = { version = \"1.0.132\", default-features = false }\n smallvec = { version = \"1.11.0\", default-features = false }\n substrate-wasm-builder = { version = \"26.0.1\", default-features = false }\n frame = { version = 
\"0.9.1\", default-features = false, package = \"polkadot-sdk-frame\" }\n\n [profile.release]\n opt-level = 3\n panic = \"unwind\"\n\n [profile.production]\n codegen-units = 1\n inherits = \"release\"\n lto = true\n ```"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 2, "depth": 3, "title": "Update the Runtime Configuration", "anchor": "update-the-runtime-configuration", "start_char": 8446, "end_char": 10351, "estimated_token_count": 406, "token_estimator": "heuristic-v1", "text": "### Update the Runtime Configuration\n\nConfigure the pallets by implementing their `Config` trait and update the runtime macro to include the new pallets:\n\n1. Add the `OriginCaller` import:\n\n ```rust title=\"mod.rs\" hl_lines=\"8\"\n // Local module imports\n use super::OriginCaller;\n ...\n ```\n\n2. Implement the [`Config`](https://paritytech.github.io/polkadot-sdk/master/pallet_utility/pallet/trait.Config.html){target=\\_blank} trait for both pallets at the end of the `runtime/src/config/mod.rs` file:\n\n ```rust title=\"mod.rs\" hl_lines=\"8-25\"\n ...\n /// Configure the pallet template in pallets/template.\n impl pallet_parachain_template::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type WeightInfo = pallet_parachain_template::weights::SubstrateWeight;\n }\n\n // Configure utility pallet.\n impl pallet_utility::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type RuntimeCall = RuntimeCall;\n type PalletsOrigin = OriginCaller;\n type WeightInfo = pallet_utility::weights::SubstrateWeight;\n }\n // Define counter max value runtime constant.\n parameter_types! {\n pub const CounterMaxValue: u32 = 500;\n }\n\n // Configure custom pallet.\n impl custom_pallet::Config for Runtime {\n type RuntimeEvent = RuntimeEvent;\n type CounterMaxValue = CounterMaxValue;\n }\n ```\n\n3. Locate the `#[frame_support::runtime]` macro in the `runtime/src/lib.rs` file and add the pallets:\n\n ```rust hl_lines=\"9-14\" title=\"lib.rs\"\n #[frame_support::runtime]\n mod runtime {\n #[runtime::runtime]\n #[runtime::derive(\n ...\n )]\n pub struct Runtime;\n #[runtime::pallet_index(51)]\n pub type Utility = pallet_utility;\n\n #[runtime::pallet_index(52)]\n pub type CustomPallet = custom_pallet;\n }\n ```"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 3, "depth": 2, "title": "Recompile the Runtime", "anchor": "recompile-the-runtime", "start_char": 10351, "end_char": 10800, "estimated_token_count": 89, "token_estimator": "heuristic-v1", "text": "## Recompile the Runtime\n\nAfter adding and configuring your pallets in the runtime, the next step is to ensure everything is set up correctly. 
To do this, recompile the runtime with the following command (make sure you're in the project's root directory):\n\n```bash\ncargo build --release\n```\n\nThis command ensures the runtime compiles without errors, validates the pallet configurations, and prepares the build for subsequent testing or deployment."} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 4, "depth": 2, "title": "Run Your Chain Locally", "anchor": "run-your-chain-locally", "start_char": 10800, "end_char": 12273, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Run Your Chain Locally\n\nLaunch your parachain locally and start producing blocks:\n\n!!!tip\n Generated chain TestNet specifications include development accounts \"Alice\" and \"Bob.\" These accounts are pre-funded with native parachain currency, allowing you to sign and send TestNet transactions. Take a look at the [Polkadot.js Accounts section](https://polkadot.js.org/apps/#/accounts){target=\\_blank} to view the development accounts for your chain.\n\n1. Create a new chain specification file with the updated runtime:\n\n ```bash\n chain-spec-builder create -t development \\\n --relay-chain paseo \\\n --para-id 1000 \\\n --runtime ./target/release/wbuild/parachain-template-runtime/parachain_template_runtime.compact.compressed.wasm \\\n named-preset development\n ```\n\n2. Start the omni node with the generated chain specification:\n\n ```bash\n polkadot-omni-node --chain ./chain_spec.json --dev\n ```\n\n3. Verify you can interact with the new pallets using the [Polkadot.js Apps](https://polkadot.js.org/apps/?rpc=ws%3A%2F%2F127.0.0.1%3A9944#/extrinsics){target=\\_blank} interface. Navigate to the **Extrinsics** tab and check that you can see both pallets:\n\n - Utility pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-1.webp)\n \n\n - Custom pallet\n\n ![](/images/tutorials/polkadot-sdk/parachains/zero-to-hero/add-pallets-to-runtime/add-pallets-to-runtime-2.webp)"} +{"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-add-pallets-to-runtime", "page_title": "Add Pallets to the Runtime", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 12273, "end_char": 13025, "estimated_token_count": 183, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Tutorial __Deploy on Paseo TestNet__\n\n ---\n\n Deploy your Polkadot SDK blockchain on Paseo! Follow this step-by-step guide for a seamless journey to a successful TestNet deployment.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/deploy-to-testnet/)\n\n- Tutorial __Pallet Benchmarking (Optional)__\n\n ---\n\n Discover how to measure extrinsic costs and assign precise weights to optimize your pallet for accurate fees and runtime performance.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/polkadot-sdk/parachains/zero-to-hero/pallet-benchmarking/)\n\n
"} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 25, "end_char": 1088, "estimated_token_count": 214, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nIn Polkadot SDK-based blockchains, runtime functionality is built through modular components called [pallets](/polkadot-protocol/glossary#pallet){target=\\_blank}. These pallets are Rust-based runtime modules created using [FRAME (Framework for Runtime Aggregation of Modular Entities)](/develop/parachains/customize-parachain/overview/){target=\\_blank}, a powerful library that simplifies blockchain development by providing specialized macros and standardized patterns for building blockchain logic.\nA pallet encapsulates a specific set of blockchain functionalities, such as managing token balances, implementing governance mechanisms, or creating custom state transitions.\n\nIn this tutorial, you'll learn how to create a custom pallet from scratch. You will develop a simple counter pallet with the following features:\n\n- Users can increment and decrement a counter.\n- Only a [root origin](https://paritytech.github.io/polkadot-sdk/master/frame_system/pallet/type.Origin.html#variant.Root){target=\\_blank} can set an arbitrary counter value."} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1088, "end_char": 1378, "estimated_token_count": 85, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nYou'll use the [Polkadot SDK Parachain Template](https://github.com/paritytech/polkadot-sdk/tree/master/templates/parachain){target=\\_blank} created in the [Set Up a Template](/tutorials/polkadot-sdk/parachains/zero-to-hero/set-up-a-template/){target=\\_blank} tutorial."} {"page_id": "tutorials-polkadot-sdk-parachains-zero-to-hero-build-custom-pallet", "page_title": "Build a Custom Pallet", "index": 2, "depth": 2, "title": "Create a New Project", "anchor": "create-a-new-project", "start_char": 1378, "end_char": 2276, "estimated_token_count": 198, "token_estimator": "heuristic-v1", "text": "## Create a New Project\n\nIn this tutorial, you'll build a custom pallet from scratch to demonstrate the complete workflow, rather than starting with the pre-built `pallet-template`. The first step is to create a new Rust package for your pallet:\n\n1. Navigate to the `pallets` directory in your workspace:\n\n ```bash\n cd pallets\n ```\n\n2. Create a new Rust library project for your custom pallet by running the following command:\n\n ```bash\n cargo new --lib custom-pallet\n ```\n\n3. Enter the new project directory:\n\n ```bash\n cd custom-pallet\n ```\n\n4. Ensure the project was created successfully by checking its structure. 
The file layout should resemble the following:\n\n ```\n custom-pallet \n ├── Cargo.toml\n └── src\n └── lib.rs\n ```\n\n If the files are in place, your project setup is complete, and you're ready to start building your custom pallet."} @@ -1483,39 +1483,39 @@ {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 197, "end_char": 815, "estimated_token_count": 123, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nCreating [smart contracts](/develop/smart-contracts/overview/){target=\\_blank} is fundamental to blockchain development. While many frameworks and tools are available, understanding how to write a contract from scratch with just a text editor is essential knowledge.\n\nThis tutorial will guide you through creating a basic smart contract that can be used with other tutorials for deployment and integration on Polkadot Hub. To understand how smart contracts work in Polkadot Hub, check the [Smart Contract Basics](/polkadot-protocol/smart-contract-basics/){target=\\_blank} guide for more information."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 815, "end_char": 1267, "estimated_token_count": 119, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- A text editor of your choice ([VS Code](https://code.visualstudio.com/){target=\\_blank}, [Sublime Text](https://www.sublimetext.com/){target=\\_blank}, etc.).\n- Basic understanding of programming concepts.\n- Familiarity with the Solidity programming language syntax. For further references, check the official [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank}."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 2, "depth": 2, "title": "Understanding Smart Contract Structure", "anchor": "understanding-smart-contract-structure", "start_char": 1267, "end_char": 2249, "estimated_token_count": 216, "token_estimator": "heuristic-v1", "text": "## Understanding Smart Contract Structure\n\nLet's explore these components before building the contract:\n\n- **[SPDX license identifier](https://docs.soliditylang.org/en/v0.6.8/layout-of-source-files.html){target=\\_blank}**: A standardized way to declare the license under which your code is released. This helps with legal compliance and is required by the Solidity compiler to avoid warnings.\n- **Pragma directive**: Specifies which version of Solidity compiler should be used for your contract.\n- **Contract declaration**: Similar to a class in object-oriented programming, it defines the boundaries of your smart contract.\n- **State variables**: Data stored directly in the contract that persists between function calls. 
These represent the contract's \"state\" on the blockchain.\n- **Functions**: Executable code that can read or modify the contract's state variables.\n- **Events**: Notification mechanisms that applications can subscribe to in order to track blockchain changes."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 3, "depth": 2, "title": "Create the Smart Contract", "anchor": "create-the-smart-contract", "start_char": 2249, "end_char": 4545, "estimated_token_count": 480, "token_estimator": "heuristic-v1", "text": "## Create the Smart Contract\n\nIn this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners. It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes.\n\nThis contract will:\n\n- Store a number.\n- Allow updating the stored number.\n- Emit an event when the number changes.\n\nTo build the smart contract, follow the steps below:\n\n1. Create a new file named `Storage.sol`.\n\n2. Add the SPDX license identifier at the top of the file:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n ```\n\n This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings.\n\n3. Specify the Solidity version:\n\n ```solidity\n pragma solidity ^0.8.28;\n ```\n\n The caret `^` means \"this version or any compatible newer version.\" This helps ensure your contract compiles correctly with the intended compiler features.\n\n4. Create the contract structure:\n\n ```solidity\n contract Storage {\n // Contract code will go here\n }\n ```\n\n This defines a contract named \"Storage\", similar to how you would define a class in other programming languages.\n\n5. Add the state variables and event:\n\n ```solidity\n contract Storage {\n // State variable to store a number\n uint256 private number;\n \n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n }\n ```\n\n Here, you're defining:\n\n - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract.\n - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data.\n\n6. Add the getter and setter functions:\n\n ```solidity\n \n ```\n\n??? 
code \"Complete Storage.sol contract\"\n\n ```solidity title=\"Storage.sol\"\n \n ```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 4, "depth": 2, "title": "Understanding the Code", "anchor": "understanding-the-code", "start_char": 4545, "end_char": 7112, "estimated_token_count": 524, "token_estimator": "heuristic-v1", "text": "## Understanding the Code\n\nLet's break down the key components of the contract:\n\n- **State Variable**\n\n - **`uint256 private number`**: A private variable that can only be accessed through the contract's functions.\n - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. \"Private\" in Solidity doesn't mean the data is encrypted or truly hidden.\n - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations).\n\n- **Event**\n\n - **`event NumberChanged(uint256 newNumber)`**: Emitted when the stored number changes.\n - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications.\n - Unlike state variables, events cannot be read by smart contracts, only by external applications.\n - Events are much more gas-efficient than storing data when you only need to notify external systems of changes.\n\n- **Functions**\n\n - **`store(uint256 newNumber)`**: Updates the stored number and emits an event.\n - This function changes the state of the contract and requires a transaction to execute.\n - The `emit` keyword is used to trigger the defined event.\n\n - **`retrieve()`**: Returns the current stored number.\n - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state.\n - View functions don't require a transaction and don't cost gas when called externally.\n\n For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed.\n\nThis basic contract serves as a foundation for learning smart contract development. Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment.\n\nFor more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\\_blank}."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 7112, "end_char": 7480, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n\n
\n\n- Tutorial __Test and Deploy with Hardhat__\n\n ---\n\n Learn how to test and deploy the smart contract you created by using Hardhat.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)\n\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 3, "depth": 2, "title": "Create the Smart Contract", "anchor": "create-the-smart-contract", "start_char": 2249, "end_char": 5735, "estimated_token_count": 680, "token_estimator": "heuristic-v1", "text": "## Create the Smart Contract\n\nIn this section, you'll build a simple storage contract step by step. This basic Storage contract is a great starting point for beginners. It introduces key concepts like state variables, functions, and events in a simple way, demonstrating how data is stored and updated on the blockchain. Later, you'll explore each component in more detail to understand what's happening behind the scenes.\n\nThis contract will:\n\n- Store a number.\n- Allow updating the stored number.\n- Emit an event when the number changes.\n\nTo build the smart contract, follow the steps below:\n\n1. Create a new file named `Storage.sol`.\n\n2. Add the SPDX license identifier at the top of the file:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n ```\n\n This line tells users and tools which license governs your code. The [MIT license](https://opensource.org/license/mit){target=\\_blank} is commonly used for open-source projects. The Solidity compiler requires this line to avoid licensing-related warnings.\n\n3. Specify the Solidity version:\n\n ```solidity\n pragma solidity ^0.8.28;\n ```\n\n The caret `^` means \"this version or any compatible newer version.\" This helps ensure your contract compiles correctly with the intended compiler features.\n\n4. Create the contract structure:\n\n ```solidity\n contract Storage {\n // Contract code will go here\n }\n ```\n\n This defines a contract named \"Storage\", similar to how you would define a class in other programming languages.\n\n5. Add the state variables and event:\n\n ```solidity\n contract Storage {\n // State variable to store a number\n uint256 private number;\n \n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n }\n ```\n\n Here, you're defining:\n\n - A state variable named `number` of type `uint256` (unsigned integer with 256 bits), which is marked as `private` so it can only be accessed via functions within this contract.\n - An event named `NumberChanged` that will be triggered whenever the number changes. The event includes the new value as data.\n\n6. Add the getter and setter functions:\n\n ```solidity\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```\n\n??? 
code \"Complete Storage.sol contract\"\n\n ```solidity title=\"Storage.sol\"\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 4, "depth": 2, "title": "Understanding the Code", "anchor": "understanding-the-code", "start_char": 5735, "end_char": 8302, "estimated_token_count": 524, "token_estimator": "heuristic-v1", "text": "## Understanding the Code\n\nLet's break down the key components of the contract:\n\n- **State Variable**\n\n - **`uint256 private number`**: A private variable that can only be accessed through the contract's functions.\n - The `private` keyword prevents direct access from other contracts, but it's important to note that while other contracts cannot read this variable directly, the data itself is still visible on the blockchain and can be read by external tools or applications that interact with the blockchain. \"Private\" in Solidity doesn't mean the data is encrypted or truly hidden.\n - State variables in Solidity are permanent storage on the blockchain, making them different from variables in traditional programming. Every change to a state variable requires a transaction and costs gas (the fee paid for blockchain operations).\n\n- **Event**\n\n - **`event NumberChanged(uint256 newNumber)`**: Emitted when the stored number changes.\n - When triggered, events write data to the blockchain's log, which can be efficiently queried by applications.\n - Unlike state variables, events cannot be read by smart contracts, only by external applications.\n - Events are much more gas-efficient than storing data when you only need to notify external systems of changes.\n\n- **Functions**\n\n - **`store(uint256 newNumber)`**: Updates the stored number and emits an event.\n - This function changes the state of the contract and requires a transaction to execute.\n - The `emit` keyword is used to trigger the defined event.\n\n - **`retrieve()`**: Returns the current stored number.\n - The `view` keyword indicates that this function only reads data and doesn't modify the contract's state.\n - View functions don't require a transaction and don't cost gas when called externally.\n\n For those new to Solidity, this naming pattern (getter/setter functions) is a common design pattern. Instead of directly accessing state variables, the convention is to use functions to control access and add additional logic if needed.\n\nThis basic contract serves as a foundation for learning smart contract development. 
Real-world contracts often require additional security considerations, more complex logic, and thorough testing before deployment.\n\nFor more detailed information about Solidity types, functions, and best practices, refer to the [Solidity documentation](https://docs.soliditylang.org/en/latest/){target=\\_blank} or this [beginner's guide to Solidity](https://www.tutorialspoint.com/solidity/index.htm){target=\\_blank}."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-contracts", "page_title": "Create a Smart Contract", "index": 5, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 8302, "end_char": 8670, "estimated_token_count": 98, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n\n
\n\n- Tutorial __Test and Deploy with Hardhat__\n\n ---\n\n Learn how to test and deploy the smart contract you created by using Hardhat.\n\n [:octicons-arrow-right-24: Get Started](/tutorials/smart-contracts/launch-your-first-project/test-and-deploy-with-hardhat/)\n\n
"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 202, "end_char": 1019, "estimated_token_count": 167, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nDecentralized applications (dApps) have become a cornerstone of the Web3 ecosystem, allowing developers to create applications that interact directly with blockchain networks. Polkadot Hub, a blockchain that supports smart contract functionality, provides an excellent platform for deploying and interacting with dApps.\n\nIn this tutorial, you'll build a complete dApp that interacts with a smart contract deployed on the Polkadot Hub TestNet. It will use [Ethers.js](/develop/smart-contracts/libraries/ethers-js){target=\\_blank} to interact with the blockchain and [Next.js](https://nextjs.org/){target=\\_blank} as the frontend framework. By the end of this tutorial, you'll have a functional dApp that allows users to connect their wallets, read data from the blockchain, and execute transactions."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 1019, "end_char": 1479, "estimated_token_count": 111, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore you begin, make sure you have:\n\n- [Node.js](https://nodejs.org/en){target=\\_blank} v16 or newer installed on your machine.\n- A crypto wallet (like MetaMask) with some test tokens. For further information, check the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot){target=\\_blank} guide.\n- Basic understanding of React and JavaScript.\n- Familiarity with blockchain concepts and Solidity (helpful but not mandatory)."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 2, "depth": 2, "title": "Project Overview", "anchor": "project-overview", "start_char": 1479, "end_char": 2676, "estimated_token_count": 301, "token_estimator": "heuristic-v1", "text": "## Project Overview\n\nThe dApp will interact with a simple Storage contract. For a step-by-step guide on creating it, refer to the [Create Contracts](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} tutorial. This contract allows:\n\n- Reading a stored number from the blockchain.\n- Updating the stored number with a new value.\n\nThe contract has already been deployed to the Polkadot Hub TestNet for testing purposes: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. 
If you want to deploy your own, follow the [Deploying Contracts](/develop/smart-contracts/dev-environments/remix/#deploying-contracts){target=\\_blank} section.\n\nHere's a simplified view of what you'll be building:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-1.webp)\n\nThe general structure of the project should end up as follows:\n\n```bash\nethers-dapp\n├── abis\n│ └── Storage.json\n└── app\n ├── components\n │ ├── ReadContract.js\n │ ├── WalletConnect.js\n │ └── WriteContract.js\n ├── favicon.ico\n ├── globals.css\n ├── layout.js\n ├── page.js\n └── utils\n ├── contract.js\n └── ethers.js\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 3, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2676, "end_char": 2923, "estimated_token_count": 77, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nLet's start by creating a new Next.js project:\n\n```bash\nnpx create-next-app ethers-dapp --js --eslint --tailwind --app --yes\ncd ethers-dapp\n```\n\nNext, install the needed dependencies:\n\n```bash\nnpm install ethers@6.13.5\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2923, "end_char": 3781, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code:\n\n```javascript title=\"app/utils/ethers.js\"\n\n```\n\nThis file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 3781, "end_char": 4485, "estimated_token_count": 169, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple Storage contract already deployed. So, you need to create an interface to interact with it. First, ensure to create a folder called `abis` at the root of your project, create a file `Storage.json`, and paste the corresponding ABI (Application Binary Interface) of the Storage contract. 
You can copy and paste the following:\n\n???+ code \"Storage.sol ABI\"\n\n ```json title=\"abis/Storage.json\"\n \n ```\n\nNow, create a file called `app/utils/contract.js`:\n\n```javascript title=\"app/utils/contract.js\"\n\n```\n\nThis file defines the contract address, ABI, and functions to create instances of the contract for reading and writing."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 4485, "end_char": 5247, "estimated_token_count": 188, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNext, let's create a component to handle wallet connections. Create a new file called `app/components/WalletConnect.js`:\n\n```javascript title=\"app/components/WalletConnect.js\"\n\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. \n\nTo integrate this component to your dApp, you need to overwrite the existing boilerplate in `app/page.js` with the following code:\n\n```javascript title=\"app/page.js\"\n\n\n\n\n```\n\nIn your terminal, you can launch your project by running:\n\n```bash\nnpm run dev\n```\n\nAnd you will see the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-2.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 7, "depth": 2, "title": "Read Data from the Blockchain", "anchor": "read-data-from-the-blockchain", "start_char": 5247, "end_char": 5940, "estimated_token_count": 177, "token_estimator": "heuristic-v1", "text": "## Read Data from the Blockchain\n\nNow, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`:\n\n```javascript title=\"app/components/ReadContract.js\"\n\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically.\n\nTo see this change in your dApp, you need to integrate this component into the `app/page.js` file:\n\n```javascript title=\"app/page.js\"\n\n\n\n\n```\n\nYour dApp will automatically be updated to the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-3.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 8, "depth": 2, "title": "Write Data to the Blockchain", "anchor": "write-data-to-the-blockchain", "start_char": 5940, "end_char": 6681, "estimated_token_count": 181, "token_estimator": "heuristic-v1", "text": "## Write Data to the Blockchain\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`:\n\n```javascript title=\"app/components/WriteContract.js\"\n\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. 
When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed.\n\nUpdate the `app/page.js` file to integrate all components:\n\n```javascript title=\"app/page.js\"\n\n```\n\nThe completed UI will display:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-4.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 9, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 6681, "end_char": 7495, "estimated_token_count": 171, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've built a complete dApp that interacts with a smart contract on the Polkadot Hub TestNet using Ethers.js and Next.js. Your application can now:\n\n- Connect to a user's wallet.\n- Read data from a smart contract.\n- Send transactions to update the contract state.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With these building blocks, you can extend your application to interact with more sophisticated smart contracts and create more advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/ethers-dapp\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2923, "end_char": 4631, "estimated_token_count": 417, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with the Polkadot Hub, you need to set up an [Ethers.js Provider](/develop/smart-contracts/libraries/ethers-js/#set-up-the-ethersjs-provider){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/ethers.js` and add the following code:\n\n```javascript title=\"app/utils/ethers.js\"\nimport { JsonRpcProvider } from 'ethers';\n\nexport const PASSET_HUB_CONFIG = {\n name: 'Passet Hub',\n rpc: 'https://testnet-passet-hub-eth-rpc.polkadot.io/', // Passet Hub testnet RPC\n chainId: 420420422, // Passet Hub testnet chainId\n blockExplorer: 'https://blockscout-passet-hub.parity-testnet.parity.io/',\n};\n\nexport const getProvider = () => {\n return new JsonRpcProvider(PASSET_HUB_CONFIG.rpc, {\n chainId: PASSET_HUB_CONFIG.chainId,\n name: PASSET_HUB_CONFIG.name,\n });\n};\n\n// Helper to get a signer from a provider\nexport const getSigner = async (provider) => {\n if (window.ethereum) {\n await window.ethereum.request({ method: 'eth_requestAccounts' });\n const ethersProvider = new ethers.BrowserProvider(window.ethereum);\n return ethersProvider.getSigner();\n }\n throw new Error('No Ethereum browser provider detected');\n};\n```\n\nThis file establishes a connection to the Polkadot Hub TestNet and provides helper functions for obtaining a [Provider](https://docs.ethers.org/v5/api/providers/provider/){target=_blank} and [Signer](https://docs.ethers.org/v5/api/signer/){target=_blank}. 
The provider allows you to read data from the blockchain, while the signer enables users to send transactions and modify the blockchain state."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 4631, "end_char": 6548, "estimated_token_count": 403, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple Storage contract already deployed. So, you need to create an interface to interact with it. First, ensure to create a folder called `abis` at the root of your project, create a file `Storage.json`, and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n???+ code \"Storage.sol ABI\"\n\n ```json title=\"abis/Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNow, create a file called `app/utils/contract.js`:\n\n```javascript title=\"app/utils/contract.js\"\nimport { Contract } from 'ethers';\nimport { getProvider } from './ethers';\nimport StorageABI from '../../abis/Storage.json';\n\nexport const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f';\n\nexport const CONTRACT_ABI = StorageABI;\n\nexport const getContract = () => {\n const provider = getProvider();\n return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, provider);\n};\n\nexport const getSignedContract = async (signer) => {\n return new Contract(CONTRACT_ADDRESS, CONTRACT_ABI, signer);\n};\n```\n\nThis file defines the contract address, ABI, and functions to create instances of the contract for reading and writing."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 6548, "end_char": 12876, "estimated_token_count": 1445, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNext, let's create a component to handle wallet connections. 
Create a new file called `app/components/WalletConnect.js`:\n\n```javascript title=\"app/components/WalletConnect.js\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { PASSET_HUB_CONFIG } from '../utils/ethers';\n\nconst WalletConnect = ({ onConnect }) => {\n const [account, setAccount] = useState(null);\n const [chainId, setChainId] = useState(null);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Check if user already has an authorized wallet connection\n const checkConnection = async () => {\n if (window.ethereum) {\n try {\n // eth_accounts doesn't trigger the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_accounts',\n });\n if (accounts.length > 0) {\n setAccount(accounts[0]);\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n });\n setChainId(parseInt(chainIdHex, 16));\n }\n } catch (err) {\n console.error('Error checking connection:', err);\n setError('Failed to check wallet connection');\n }\n }\n };\n\n checkConnection();\n\n if (window.ethereum) {\n // Setup wallet event listeners\n window.ethereum.on('accountsChanged', (accounts) => {\n setAccount(accounts[0] || null);\n if (accounts[0] && onConnect) onConnect(accounts[0]);\n });\n\n window.ethereum.on('chainChanged', (chainIdHex) => {\n setChainId(parseInt(chainIdHex, 16));\n });\n }\n\n return () => {\n // Cleanup event listeners\n if (window.ethereum) {\n window.ethereum.removeListener('accountsChanged', () => {});\n window.ethereum.removeListener('chainChanged', () => {});\n }\n };\n }, [onConnect]);\n\n const connectWallet = async () => {\n if (!window.ethereum) {\n setError(\n 'MetaMask not detected! Please install MetaMask to use this dApp.'\n );\n return;\n }\n\n try {\n // eth_requestAccounts triggers the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_requestAccounts',\n });\n setAccount(accounts[0]);\n\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n });\n const currentChainId = parseInt(chainIdHex, 16);\n setChainId(currentChainId);\n\n // Prompt user to switch networks if needed\n if (currentChainId !== PASSET_HUB_CONFIG.chainId) {\n await switchNetwork();\n }\n\n if (onConnect) onConnect(accounts[0]);\n } catch (err) {\n console.error('Error connecting to wallet:', err);\n setError('Failed to connect wallet');\n }\n };\n\n const switchNetwork = async () => {\n try {\n await window.ethereum.request({\n method: 'wallet_switchEthereumChain',\n params: [{ chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}` }],\n });\n } catch (switchError) {\n // Error 4902 means the chain hasn't been added to MetaMask\n if (switchError.code === 4902) {\n try {\n await window.ethereum.request({\n method: 'wallet_addEthereumChain',\n params: [\n {\n chainId: `0x${PASSET_HUB_CONFIG.chainId.toString(16)}`,\n chainName: PASSET_HUB_CONFIG.name,\n rpcUrls: [PASSET_HUB_CONFIG.rpc],\n blockExplorerUrls: [PASSET_HUB_CONFIG.blockExplorer],\n },\n ],\n });\n } catch (addError) {\n setError('Failed to add network to wallet');\n }\n } else {\n setError('Failed to switch network');\n }\n }\n };\n\n // UI-only disconnection - MetaMask doesn't support programmatic disconnection\n const disconnectWallet = () => {\n setAccount(null);\n };\n\n return (\n
\n {error &&

{error}

}\n\n {!account ? (\n \n Connect Wallet\n \n ) : (\n
\n \n {`${account.substring(0, 6)}...${account.substring(38)}`}\n \n \n Disconnect\n \n {chainId !== PASSET_HUB_CONFIG.chainId && (\n \n Switch to Passet Hub\n \n )}\n
\n )}\n
\n );\n};\n\nexport default WalletConnect;\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. \n\nTo integrate this component to your dApp, you need to overwrite the existing boilerplate in `app/page.js` with the following code:\n\n```javascript title=\"app/page.js\"\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n
\n );\n}\n```\n\nIn your terminal, you can launch your project by running:\n\n```bash\nnpm run dev\n```\n\nAnd you will see the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-2.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 7, "depth": 2, "title": "Read Data from the Blockchain", "anchor": "read-data-from-the-blockchain", "start_char": 12876, "end_char": 16092, "estimated_token_count": 805, "token_estimator": "heuristic-v1", "text": "## Read Data from the Blockchain\n\nNow, let's create a component to read data from the contract. Create a file called `app/components/ReadContract.js`:\n\n```javascript title=\"app/components/ReadContract.js\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { getContract } from '../utils/contract';\n\nconst ReadContract = () => {\n const [storedNumber, setStoredNumber] = useState(null);\n const [loading, setLoading] = useState(true);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Function to read data from the blockchain\n const fetchData = async () => {\n try {\n setLoading(true);\n const contract = getContract();\n // Call the smart contract's storedNumber function\n const number = await contract.storedNumber();\n setStoredNumber(number.toString());\n setError(null);\n } catch (err) {\n console.error('Error fetching stored number:', err);\n setError('Failed to fetch data from the contract');\n } finally {\n setLoading(false);\n }\n };\n\n fetchData();\n\n // Poll for updates every 10 seconds to keep UI in sync with blockchain\n const interval = setInterval(fetchData, 10000);\n\n // Clean up interval on component unmount\n return () => clearInterval(interval);\n }, []);\n\n return (\n
\n

Contract Data

\n {loading ? (\n
\n
\n
\n ) : error ? (\n

{error}

\n ) : (\n
\n

\n Stored Number: {storedNumber}\n

\n
\n )}\n
\n );\n};\n\nexport default ReadContract;\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically.\n\nTo see this change in your dApp, you need to integrate this component into the `app/page.js` file:\n\n```javascript title=\"app/page.js\"\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nimport ReadContract from './components/ReadContract';\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n \n
\n );\n}\n```\n\nYour dApp will automatically be updated to the following:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-3.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 8, "depth": 2, "title": "Write Data to the Blockchain", "anchor": "write-data-to-the-blockchain", "start_char": 16092, "end_char": 21164, "estimated_token_count": 1229, "token_estimator": "heuristic-v1", "text": "## Write Data to the Blockchain\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `app/components/WriteContract.js`:\n\n```javascript title=\"app/components/WriteContract.js\"\n'use client';\n\nimport { useState } from 'react';\nimport { getSignedContract } from '../utils/contract';\nimport { ethers } from 'ethers';\n\nconst WriteContract = ({ account }) => {\n const [newNumber, setNewNumber] = useState('');\n const [status, setStatus] = useState({ type: null, message: '' });\n const [isSubmitting, setIsSubmitting] = useState(false);\n\n const handleSubmit = async (e) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: 'error', message: 'Please connect your wallet first' });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: 'error', message: 'Please enter a valid number' });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: 'info', message: 'Initiating transaction...' });\n\n // Get a signer from the connected wallet\n const provider = new ethers.BrowserProvider(window.ethereum);\n const signer = await provider.getSigner();\n const contract = await getSignedContract(signer);\n\n // Send transaction to blockchain and wait for user confirmation in wallet\n setStatus({\n type: 'info',\n message: 'Please confirm the transaction in your wallet...',\n });\n\n // Call the contract's setNumber function\n const tx = await contract.setNumber(newNumber);\n\n // Wait for transaction to be mined\n setStatus({\n type: 'info',\n message: 'Transaction submitted. Waiting for confirmation...',\n });\n const receipt = await tx.wait();\n\n setStatus({\n type: 'success',\n message: `Transaction confirmed! Transaction hash: ${receipt.hash}`,\n });\n setNewNumber('');\n } catch (err) {\n console.error('Error updating number:', err);\n\n // Error code 4001 is MetaMask's code for user rejection\n if (err.code === 4001) {\n setStatus({ type: 'error', message: 'Transaction rejected by user.' });\n } else {\n setStatus({\n type: 'error',\n message: `Error: ${err.message || 'Failed to send transaction'}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n {status.message && (\n \n {status.message}\n
\n )}\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? 'Updating...' : 'Update'}\n \n \n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n
\n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. When the transaction is successful, users will see the stored value update in the `ReadContract` component after the transaction is confirmed.\n\nUpdate the `app/page.js` file to integrate all components:\n\n```javascript title=\"app/page.js\"\n'use client';\n\nimport { useState } from 'react';\n\nimport WalletConnect from './components/WalletConnect';\nimport ReadContract from './components/ReadContract';\nimport WriteContract from './components/WriteContract';\n\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Ethers.js dApp - Passet Hub Smart Contracts\n

\n \n \n \n
\n );\n}\n```\n\nThe completed UI will display:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-ethers-js/create-dapp-ethers-js-4.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-ethers-js", "page_title": "Create a dApp With Ethers.js", "index": 9, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 21164, "end_char": 21978, "estimated_token_count": 171, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've built a complete dApp that interacts with a smart contract on the Polkadot Hub TestNet using Ethers.js and Next.js. Your application can now:\n\n- Connect to a user's wallet.\n- Read data from a smart contract.\n- Send transactions to update the contract state.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With these building blocks, you can extend your application to interact with more sophisticated smart contracts and create more advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/ethers-dapp\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 0, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 890, "end_char": 1375, "estimated_token_count": 115, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore getting started, ensure you have the following:\n\n- [Node.js](https://nodejs.org/en){target=\\_blank} v16 or later installed on your system.\n- A crypto wallet (such as MetaMask) funded with test tokens. Refer to the [Connect to Polkadot](/develop/smart-contracts/connect-to-polkadot){target=\\_blank} guide for more details.\n- A basic understanding of React and JavaScript.\n- Some familiarity with blockchain fundamentals and Solidity (useful but not required)."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 1, "depth": 2, "title": "Project Overview", "anchor": "project-overview", "start_char": 1375, "end_char": 2276, "estimated_token_count": 235, "token_estimator": "heuristic-v1", "text": "## Project Overview\n\nThis dApp will interact with a basic Storage contract. Refer to the [Create Contracts](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} tutorial for a step-by-step guide on creating this contract. 
The contract allows:\n\n- Retrieving a stored number from the blockchain.\n- Updating the stored number with a new value.\n\n\nBelow is a high-level overview of what you'll be building:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-1.webp)\n\nYour project directory will be organized as follows:\n\n```bash\nviem-dapp\n├── abis\n│ └── Storage.json\n└── app\n ├── components\n │ ├── ReadContract.tsx\n │ ├── WalletConnect.tsx\n │ └── WriteContract.tsx\n ├── favicon.ico\n ├── globals.css\n ├── layout.tsx\n ├── page.tsx\n └── utils\n ├── contract.ts\n └── viem.ts\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 2, "depth": 2, "title": "Set Up the Project", "anchor": "set-up-the-project", "start_char": 2276, "end_char": 2423, "estimated_token_count": 49, "token_estimator": "heuristic-v1", "text": "## Set Up the Project\n\nCreate a new Next.js project:\n\n```bash\nnpx create-next-app viem-dapp --ts --eslint --tailwind --app --yes\ncd viem-dapp\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 3, "depth": 2, "title": "Install Dependencies", "anchor": "install-dependencies", "start_char": 2423, "end_char": 2567, "estimated_token_count": 38, "token_estimator": "heuristic-v1", "text": "## Install Dependencies\n\nInstall viem and related packages:\n\n```bash\nnpm install viem@2.23.6\nnpm install --save-dev typescript @types/node\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2567, "end_char": 3520, "estimated_token_count": 238, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code:\n\n```typescript title=\"viem.ts\"\n\n```\n\nThis file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\\_blank} reference for more information."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 3520, "end_char": 5274, "estimated_token_count": 373, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple [Storage contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} that's already deployed in the Polkadot Hub TestNet: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. 
To interact with it, you need to define the contract interface.\n\nCreate a folder called `abis` at the root of your project, then create a file named `Storage.json` and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n??? code \"Storage.sol ABI\"\n ```json title=\"Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNext, create a file called `utils/contract.ts`:\n\n```typescript title=\"contract.ts\"\n\n```\n\nThis file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 5274, "end_char": 6261, "estimated_token_count": 230, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNow, let's create a component to handle wallet connections. Create a new file called `components/WalletConnect.tsx`:\n\n```typescript title=\"WalletConnect.tsx\"\n\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected.\n\nTo use this component in your dApp, replace the existing boilerplate in `app/page.tsx` with the following code:\n\n```typescript title=\"page.tsx\"\n\n\n\n\n```\n\nNow you're ready to run your dApp. From your project directory, execute:\n\n```bash\nnpm run dev\n```\n\nNavigate to `http://localhost:3000` in your browser, and you should see your dApp with the wallet connection button, the stored number display, and the form to update the number.\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-2.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 7, "depth": 2, "title": "Create the Read Contract Component", "anchor": "create-the-read-contract-component", "start_char": 6261, "end_char": 6962, "estimated_token_count": 172, "token_estimator": "heuristic-v1", "text": "## Create the Read Contract Component\n\nNow, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`:\n\n```typescript title=\"ReadContract.tsx\"\n\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. 
It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state.\n\nTo reflect this change in your dApp, incorporate this component into the `app/page.tsx` file.\n\n```typescript title=\"page.tsx\"\n\n\n\n\n```\n\nAnd you will see in your browser:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-3.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 8, "depth": 2, "title": "Create the Write Contract Component", "anchor": "create-the-write-contract-component", "start_char": 6962, "end_char": 14164, "estimated_token_count": 1627, "token_estimator": "heuristic-v1", "text": "## Create the Write Contract Component\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `components/WriteContract.tsx`:\n\n```typescript title=\"WriteContract.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { publicClient, getWalletClient } from \"../utils/viem\";\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from \"../utils/contract\";\n\ninterface WriteContractProps {\n account: string | null;\n}\n\nconst WriteContract: React.FC = ({ account }) => {\n const [newNumber, setNewNumber] = useState(\"\");\n const [status, setStatus] = useState<{\n type: string | null;\n message: string;\n }>({\n type: null,\n message: \"\",\n });\n const [isSubmitting, setIsSubmitting] = useState(false);\n const [isCorrectNetwork, setIsCorrectNetwork] = useState(true);\n\n // Check if the account is on the correct network\n useEffect(() => {\n const checkNetwork = async () => {\n if (!account) return;\n\n try {\n // Get the chainId from the public client\n const chainId = await publicClient.getChainId();\n\n // Get the user's current chainId from their wallet\n const walletClient = await getWalletClient();\n if (!walletClient) return;\n\n const walletChainId = await walletClient.getChainId();\n\n // Check if they match\n setIsCorrectNetwork(chainId === walletChainId);\n } catch (err) {\n console.error(\"Error checking network:\", err);\n setIsCorrectNetwork(false);\n }\n };\n\n checkNetwork();\n }, [account]);\n\n const handleSubmit = async (e: React.FormEvent) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: \"error\", message: \"Please connect your wallet first\" });\n return;\n }\n\n if (!isCorrectNetwork) {\n setStatus({\n type: \"error\",\n message: \"Please switch to the correct network in your wallet\",\n });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: \"error\", message: \"Please enter a valid number\" });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: \"info\", message: \"Initiating transaction...\" });\n\n // Get wallet client for transaction signing\n const walletClient = await getWalletClient();\n\n if (!walletClient) {\n setStatus({ type: \"error\", message: \"Wallet client not available\" });\n return;\n }\n\n // Check if account matches\n if (\n walletClient.account?.address.toLowerCase() !== account.toLowerCase()\n ) {\n setStatus({\n type: \"error\",\n message:\n \"Connected wallet account doesn't match the selected account\",\n });\n return;\n }\n\n // Prepare transaction and wait for user confirmation in wallet\n setStatus({\n type: \"info\",\n message: \"Please confirm the transaction in your wallet...\",\n });\n\n // Simulate 
the contract call first\n console.log('newNumber', newNumber);\n const { request } = await publicClient.simulateContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: \"setNumber\",\n args: [BigInt(newNumber)],\n account: walletClient.account,\n });\n\n // Send the transaction with wallet client\n const hash = await walletClient.writeContract(request);\n\n // Wait for transaction to be mined\n setStatus({\n type: \"info\",\n message: \"Transaction submitted. Waiting for confirmation...\",\n });\n\n const receipt = await publicClient.waitForTransactionReceipt({\n hash,\n });\n\n setStatus({\n type: \"success\",\n message: `Transaction confirmed! Transaction hash: ${receipt.transactionHash}`,\n });\n\n setNewNumber(\"\");\n } catch (err: any) {\n console.error(\"Error updating number:\", err);\n\n // Handle specific errors\n if (err.code === 4001) {\n // User rejected transaction\n setStatus({ type: \"error\", message: \"Transaction rejected by user.\" });\n } else if (err.message?.includes(\"Account not found\")) {\n // Account not found on the network\n setStatus({\n type: \"error\",\n message:\n \"Account not found on current network. Please check your wallet is connected to the correct network.\",\n });\n } else if (err.message?.includes(\"JSON is not a valid request object\")) {\n // JSON error - specific to your current issue\n setStatus({\n type: \"error\",\n message:\n \"Invalid request format. Please try again or contact support.\",\n });\n } else {\n // Other errors\n setStatus({\n type: \"error\",\n message: `Error: ${err.message || \"Failed to send transaction\"}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n\n {!isCorrectNetwork && account && (\n
\n ⚠️ You are not connected to the correct network. Please switch\n networks in your wallet.\n
\n )}\n\n {status.message && (\n \n {status.message}\n
\n )}\n\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? \"Updating...\" : \"Update\"}\n \n \n\n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n \n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. It provides appropriate feedback during each step of the transaction process and handles error scenarios.\n\nUpdate the `app/page.tsx` file to integrate all components:\n\n```typescript title=\"page.tsx\"\n\n```\nAfter that, you will see:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-4.webp)"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 9, "depth": 2, "title": "How It Works", "anchor": "how-it-works", "start_char": 14164, "end_char": 15308, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## How It Works\n\nLet's examine how the dApp interacts with the blockchain:\n\n1. Wallet connection: \n\n - The `WalletConnect` component uses the browser's Ethereum provider (MetaMask) to connect to the user's wallet.\n - It handles network switching to ensure the user is connected to the Polkadot Hub TestNet.\n - Once connected, it provides the user's account address to the parent component.\n\n2. Reading data:\n\n - The `ReadContract` component uses viem's `readContract` function to call the `storedNumber` view function.\n - It periodically polls for updates to keep the UI in sync with the blockchain state.\n - The component displays a loading indicator while fetching data and handles error states.\n\n3. Writing data:\n\n - The `WriteContract` component uses viem's `writeContract` function to send a transaction to the `setNumber` function.\n - It ensures the wallet is connected before allowing a transaction.\n - The component shows detailed feedback during transaction submission and confirmation.\n - After a successful transaction, the value displayed in the `ReadContract` component will update on the next poll."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 10, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 15308, "end_char": 16165, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully built a fully functional dApp that interacts with a smart contract on Polkadot Hub using viem and Next.js. Your application can now:\n\n- Connect to a user's wallet and handle network switching.\n- Read data from a smart contract and keep it updated.\n- Write data to the blockchain through transactions.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With this knowledge, you can extend your application to interact with more sophisticated smart contracts and create advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/viem-dapp\n```"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 11, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 16165, "end_char": 16479, "estimated_token_count": 81, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Guide __Create a dApp with Wagmi__\n\n ---\n\n Learn how to build a decentralized application by using the Wagmi framework.\n\n [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/libraries/wagmi)\n\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 4, "depth": 2, "title": "Connect to Polkadot Hub", "anchor": "connect-to-polkadot-hub", "start_char": 2567, "end_char": 4571, "estimated_token_count": 491, "token_estimator": "heuristic-v1", "text": "## Connect to Polkadot Hub\n\nTo interact with Polkadot Hub, you need to set up a [Public Client](https://viem.sh/docs/clients/public#public-client){target=\\_blank} that connects to the blockchain. In this example, you will interact with the Polkadot Hub TestNet, so you can experiment safely. Start by creating a new file called `utils/viem.ts` and add the following code:\n\n```typescript title=\"viem.ts\"\nimport { createPublicClient, http, createWalletClient, custom } from 'viem'\nimport 'viem/window';\n\n\nconst transport = http('https://testnet-passet-hub-eth-rpc.polkadot.io')\n\n// Configure the Passet Hub chain\nexport const passetHub = {\n id: 420420422,\n name: 'Passet Hub',\n network: 'passet-hub',\n nativeCurrency: {\n decimals: 18,\n name: 'PAS',\n symbol: 'PAS',\n },\n rpcUrls: {\n default: {\n http: ['https://testnet-passet-hub-eth-rpc.polkadot.io'],\n },\n },\n} as const\n\n// Create a public client for reading data\nexport const publicClient = createPublicClient({\n chain: passetHub,\n transport\n})\n\n// Create a wallet client for signing transactions\nexport const getWalletClient = async () => {\n if (typeof window !== 'undefined' && window.ethereum) {\n const [account] = await window.ethereum.request({ method: 'eth_requestAccounts' });\n return createWalletClient({\n chain: passetHub,\n transport: custom(window.ethereum),\n account,\n });\n }\n throw new Error('No Ethereum browser provider detected');\n};\n```\n\nThis file initializes a viem client, providing helper functions for obtaining a Public Client and a [Wallet Client](https://viem.sh/docs/clients/wallet#wallet-client){target=\\_blank}. The Public Client enables reading blockchain data, while the Wallet Client allows users to sign and send transactions. Also, note that by importing `'viem/window'` the global `window.ethereum` will be typed as an `EIP1193Provider`, check the [`window` Polyfill](https://viem.sh/docs/typescript#window-polyfill){target=\\_blank} reference for more information."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 5, "depth": 2, "title": "Set Up the Smart Contract Interface", "anchor": "set-up-the-smart-contract-interface", "start_char": 4571, "end_char": 7083, "estimated_token_count": 522, "token_estimator": "heuristic-v1", "text": "## Set Up the Smart Contract Interface\n\nFor this dApp, you'll use a simple [Storage contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts){target=\\_blank} that's already deployed in the Polkadot Hub TestNet: `0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f`. To interact with it, you need to define the contract interface.\n\nCreate a folder called `abis` at the root of your project, then create a file named `Storage.json` and paste the corresponding ABI (Application Binary Interface) of the Storage contract. You can copy and paste the following:\n\n??? 
code \"Storage.sol ABI\"\n ```json title=\"Storage.json\"\n [\n {\n \"inputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"_newNumber\",\n \"type\": \"uint256\"\n }\n ],\n \"name\": \"setNumber\",\n \"outputs\": [],\n \"stateMutability\": \"nonpayable\",\n \"type\": \"function\"\n },\n {\n \"inputs\": [],\n \"name\": \"storedNumber\",\n \"outputs\": [\n {\n \"internalType\": \"uint256\",\n \"name\": \"\",\n \"type\": \"uint256\"\n }\n ],\n \"stateMutability\": \"view\",\n \"type\": \"function\"\n }\n ]\n ```\n\nNext, create a file called `utils/contract.ts`:\n\n```typescript title=\"contract.ts\"\nimport { getContract } from 'viem';\nimport { publicClient, getWalletClient } from './viem';\nimport StorageABI from '../../abis/Storage.json';\n\nexport const CONTRACT_ADDRESS = '0x58053f0e8ede1a47a1af53e43368cd04ddcaf66f';\nexport const CONTRACT_ABI = StorageABI;\n\n// Create a function to get a contract instance for reading\nexport const getContractInstance = () => {\n return getContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n client: publicClient,\n });\n};\n\n// Create a function to get a contract instance with a signer for writing\nexport const getSignedContract = async () => {\n const walletClient = await getWalletClient();\n return getContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n client: walletClient,\n });\n};\n```\n\nThis file defines the contract address, ABI, and functions to create a viem [contract instance](https://viem.sh/docs/contract/getContract#contract-instances){target=\\_blank} for reading and writing operations. viem's contract utilities ensure a more efficient and type-safe interaction with smart contracts."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 6, "depth": 2, "title": "Create the Wallet Connection Component", "anchor": "create-the-wallet-connection-component", "start_char": 7083, "end_char": 14170, "estimated_token_count": 1627, "token_estimator": "heuristic-v1", "text": "## Create the Wallet Connection Component\n\nNow, let's create a component to handle wallet connections. 
Create a new file called `components/WalletConnect.tsx`:\n\n```typescript title=\"WalletConnect.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { passetHub } from \"../utils/viem\";\n\ninterface WalletConnectProps {\n onConnect: (account: string) => void;\n}\n\nconst WalletConnect: React.FC = ({ onConnect }) => {\n const [account, setAccount] = useState(null);\n const [chainId, setChainId] = useState(null);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Check if user already has an authorized wallet connection\n const checkConnection = async () => {\n if (typeof window !== 'undefined' && window.ethereum) {\n try {\n // eth_accounts doesn't trigger the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_accounts',\n }) as string[];\n \n if (accounts.length > 0) {\n setAccount(accounts[0]);\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n }) as string;\n setChainId(parseInt(chainIdHex, 16));\n onConnect(accounts[0]);\n }\n } catch (err) {\n console.error('Error checking connection:', err);\n setError('Failed to check wallet connection');\n }\n }\n };\n\n checkConnection();\n\n if (typeof window !== 'undefined' && window.ethereum) {\n // Setup wallet event listeners\n window.ethereum.on('accountsChanged', (accounts: string[]) => {\n setAccount(accounts[0] || null);\n if (accounts[0]) onConnect(accounts[0]);\n });\n\n window.ethereum.on('chainChanged', (chainIdHex: string) => {\n setChainId(parseInt(chainIdHex, 16));\n });\n }\n\n return () => {\n // Cleanup event listeners\n if (typeof window !== 'undefined' && window.ethereum) {\n window.ethereum.removeListener('accountsChanged', () => {});\n window.ethereum.removeListener('chainChanged', () => {});\n }\n };\n }, [onConnect]);\n\n const connectWallet = async () => {\n if (typeof window === 'undefined' || !window.ethereum) {\n setError(\n 'MetaMask not detected! 
Please install MetaMask to use this dApp.'\n );\n return;\n }\n\n try {\n // eth_requestAccounts triggers the wallet popup\n const accounts = await window.ethereum.request({\n method: 'eth_requestAccounts',\n }) as string[];\n \n setAccount(accounts[0]);\n\n const chainIdHex = await window.ethereum.request({\n method: 'eth_chainId',\n }) as string;\n \n const currentChainId = parseInt(chainIdHex, 16);\n setChainId(currentChainId);\n\n // Prompt user to switch networks if needed\n if (currentChainId !== passetHub.id) {\n await switchNetwork();\n }\n\n onConnect(accounts[0]);\n } catch (err) {\n console.error('Error connecting to wallet:', err);\n setError('Failed to connect wallet');\n }\n };\n\n const switchNetwork = async () => {\n console.log('Switch network')\n try {\n await window.ethereum.request({\n method: 'wallet_switchEthereumChain',\n params: [{ chainId: `0x${passetHub.id.toString(16)}` }],\n });\n } catch (switchError: any) {\n // Error 4902 means the chain hasn't been added to MetaMask\n if (switchError.code === 4902) {\n try {\n await window.ethereum.request({\n method: 'wallet_addEthereumChain',\n params: [\n {\n chainId: `0x${passetHub.id.toString(16)}`,\n chainName: passetHub.name,\n rpcUrls: [passetHub.rpcUrls.default.http[0]],\n nativeCurrency: {\n name: passetHub.nativeCurrency.name,\n symbol: passetHub.nativeCurrency.symbol,\n decimals: passetHub.nativeCurrency.decimals,\n },\n },\n ],\n });\n } catch (addError) {\n setError('Failed to add network to wallet');\n }\n } else {\n setError('Failed to switch network');\n }\n }\n };\n\n // UI-only disconnection - MetaMask doesn't support programmatic disconnection\n const disconnectWallet = () => {\n setAccount(null);\n };\n\n return (\n
\n {error &&

{error}

}\n\n {!account ? (\n \n Connect Wallet\n \n ) : (\n
\n \n {`${account.substring(0, 6)}...${account.substring(38)}`}\n \n \n Disconnect\n \n {chainId !== passetHub.id && (\n \n Switch to Passet Hub\n \n )}\n
\n )}\n
\n );\n};\n\nexport default WalletConnect;\n```\n\nThis component handles connecting to the wallet, switching networks if necessary, and keeping track of the connected account. It provides a button for users to connect their wallet and displays the connected account address once connected.\n\nTo use this component in your dApp, replace the existing boilerplate in `app/page.tsx` with the following code:\n\n```typescript title=\"page.tsx\"\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n
\n );\n}\n```\n\nNow you're ready to run your dApp. From your project directory, execute:\n\n```bash\nnpm run dev\n```\n\nNavigate to `http://localhost:3000` in your browser, and you should see your dApp with the wallet connection button, the stored number display, and the form to update the number.\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-2.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 7, "depth": 2, "title": "Create the Read Contract Component", "anchor": "create-the-read-contract-component", "start_char": 14170, "end_char": 17647, "estimated_token_count": 854, "token_estimator": "heuristic-v1", "text": "## Create the Read Contract Component\n\nNow, let's create a component to read data from the contract. Create a file called `components/ReadContract.tsx`:\n\n```typescript title=\"ReadContract.tsx\"\n'use client';\n\nimport React, { useState, useEffect } from 'react';\nimport { publicClient } from '../utils/viem';\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from '../utils/contract';\n\nconst ReadContract: React.FC = () => {\n const [storedNumber, setStoredNumber] = useState(null);\n const [loading, setLoading] = useState(true);\n const [error, setError] = useState(null);\n\n useEffect(() => {\n // Function to read data from the blockchain\n const fetchData = async () => {\n try {\n setLoading(true);\n // Call the smart contract's storedNumber function\n const number = await publicClient.readContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: 'storedNumber',\n args: [],\n }) as bigint;\n\n setStoredNumber(number.toString());\n setError(null);\n } catch (err) {\n console.error('Error fetching stored number:', err);\n setError('Failed to fetch data from the contract');\n } finally {\n setLoading(false);\n }\n };\n\n fetchData();\n\n // Poll for updates every 10 seconds to keep UI in sync with blockchain\n const interval = setInterval(fetchData, 10000);\n\n // Clean up interval on component unmount\n return () => clearInterval(interval);\n }, []);\n\n return (\n
\n

Contract Data

\n {loading ? (\n
\n
\n
\n ) : error ? (\n

{error}

\n ) : (\n
\n

\n Stored Number: {storedNumber}\n

\n
\n )}\n
\n );\n};\n\nexport default ReadContract;\n```\n\nThis component reads the `storedNumber` value from the contract and displays it to the user. It also sets up a polling interval to refresh the data periodically, ensuring that the UI stays in sync with the blockchain state.\n\nTo reflect this change in your dApp, incorporate this component into the `app/page.tsx` file.\n\n```typescript title=\"page.tsx\"\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nimport ReadContract from \"./components/ReadContract\";\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n \n
\n );\n}\n```\n\nAnd you will see in your browser:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-3.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 8, "depth": 2, "title": "Create the Write Contract Component", "anchor": "create-the-write-contract-component", "start_char": 17647, "end_char": 25635, "estimated_token_count": 1808, "token_estimator": "heuristic-v1", "text": "## Create the Write Contract Component\n\nFinally, let's create a component that allows users to update the stored number. Create a file called `components/WriteContract.tsx`:\n\n```typescript title=\"WriteContract.tsx\"\n\"use client\";\n\nimport React, { useState, useEffect } from \"react\";\nimport { publicClient, getWalletClient } from \"../utils/viem\";\nimport { CONTRACT_ADDRESS, CONTRACT_ABI } from \"../utils/contract\";\n\ninterface WriteContractProps {\n account: string | null;\n}\n\nconst WriteContract: React.FC = ({ account }) => {\n const [newNumber, setNewNumber] = useState(\"\");\n const [status, setStatus] = useState<{\n type: string | null;\n message: string;\n }>({\n type: null,\n message: \"\",\n });\n const [isSubmitting, setIsSubmitting] = useState(false);\n const [isCorrectNetwork, setIsCorrectNetwork] = useState(true);\n\n // Check if the account is on the correct network\n useEffect(() => {\n const checkNetwork = async () => {\n if (!account) return;\n\n try {\n // Get the chainId from the public client\n const chainId = await publicClient.getChainId();\n\n // Get the user's current chainId from their wallet\n const walletClient = await getWalletClient();\n if (!walletClient) return;\n\n const walletChainId = await walletClient.getChainId();\n\n // Check if they match\n setIsCorrectNetwork(chainId === walletChainId);\n } catch (err) {\n console.error(\"Error checking network:\", err);\n setIsCorrectNetwork(false);\n }\n };\n\n checkNetwork();\n }, [account]);\n\n const handleSubmit = async (e: React.FormEvent) => {\n e.preventDefault();\n\n // Validation checks\n if (!account) {\n setStatus({ type: \"error\", message: \"Please connect your wallet first\" });\n return;\n }\n\n if (!isCorrectNetwork) {\n setStatus({\n type: \"error\",\n message: \"Please switch to the correct network in your wallet\",\n });\n return;\n }\n\n if (!newNumber || isNaN(Number(newNumber))) {\n setStatus({ type: \"error\", message: \"Please enter a valid number\" });\n return;\n }\n\n try {\n setIsSubmitting(true);\n setStatus({ type: \"info\", message: \"Initiating transaction...\" });\n\n // Get wallet client for transaction signing\n const walletClient = await getWalletClient();\n\n if (!walletClient) {\n setStatus({ type: \"error\", message: \"Wallet client not available\" });\n return;\n }\n\n // Check if account matches\n if (\n walletClient.account?.address.toLowerCase() !== account.toLowerCase()\n ) {\n setStatus({\n type: \"error\",\n message:\n \"Connected wallet account doesn't match the selected account\",\n });\n return;\n }\n\n // Prepare transaction and wait for user confirmation in wallet\n setStatus({\n type: \"info\",\n message: \"Please confirm the transaction in your wallet...\",\n });\n\n // Simulate the contract call first\n console.log('newNumber', newNumber);\n const { request } = await publicClient.simulateContract({\n address: CONTRACT_ADDRESS,\n abi: CONTRACT_ABI,\n functionName: \"setNumber\",\n args: [BigInt(newNumber)],\n account: 
walletClient.account,\n });\n\n // Send the transaction with wallet client\n const hash = await walletClient.writeContract(request);\n\n // Wait for transaction to be mined\n setStatus({\n type: \"info\",\n message: \"Transaction submitted. Waiting for confirmation...\",\n });\n\n const receipt = await publicClient.waitForTransactionReceipt({\n hash,\n });\n\n setStatus({\n type: \"success\",\n message: `Transaction confirmed! Transaction hash: ${receipt.transactionHash}`,\n });\n\n setNewNumber(\"\");\n } catch (err: any) {\n console.error(\"Error updating number:\", err);\n\n // Handle specific errors\n if (err.code === 4001) {\n // User rejected transaction\n setStatus({ type: \"error\", message: \"Transaction rejected by user.\" });\n } else if (err.message?.includes(\"Account not found\")) {\n // Account not found on the network\n setStatus({\n type: \"error\",\n message:\n \"Account not found on current network. Please check your wallet is connected to the correct network.\",\n });\n } else if (err.message?.includes(\"JSON is not a valid request object\")) {\n // JSON error - specific to your current issue\n setStatus({\n type: \"error\",\n message:\n \"Invalid request format. Please try again or contact support.\",\n });\n } else {\n // Other errors\n setStatus({\n type: \"error\",\n message: `Error: ${err.message || \"Failed to send transaction\"}`,\n });\n }\n } finally {\n setIsSubmitting(false);\n }\n };\n\n return (\n
\n

Update Stored Number

\n\n {!isCorrectNetwork && account && (\n
\n ⚠️ You are not connected to the correct network. Please switch\n networks in your wallet.\n
\n )}\n\n {status.message && (\n \n {status.message}\n
\n )}\n\n
\n setNewNumber(e.target.value)}\n disabled={isSubmitting || !account}\n className=\"w-full p-2 border rounded-md focus:outline-none focus:ring-2 focus:ring-pink-400\"\n />\n \n {isSubmitting ? \"Updating...\" : \"Update\"}\n \n \n\n {!account && (\n

\n Connect your wallet to update the stored number.\n

\n )}\n \n );\n};\n\nexport default WriteContract;\n```\n\nThis component allows users to input a new number and send a transaction to update the value stored in the contract. It provides appropriate feedback during each step of the transaction process and handles error scenarios.\n\nUpdate the `app/page.tsx` file to integrate all components:\n\n```typescript title=\"page.tsx\"\n\"use client\";\n\nimport { useState } from \"react\";\nimport WalletConnect from \"./components/WalletConnect\";\nimport ReadContract from \"./components/ReadContract\";\nimport WriteContract from \"./components/WriteContract\";\n\nexport default function Home() {\n const [account, setAccount] = useState(null);\n\n const handleConnect = (connectedAccount: string) => {\n setAccount(connectedAccount);\n };\n\n return (\n
\n

\n Viem dApp - Passet Hub Smart Contracts\n

\n \n \n \n
\n );\n}\n```\nAfter that, you will see:\n\n![](/images/tutorials/smart-contracts/launch-your-first-project/create-dapp-viem/create-dapp-viem-4.webp)"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 9, "depth": 2, "title": "How It Works", "anchor": "how-it-works", "start_char": 25635, "end_char": 26779, "estimated_token_count": 217, "token_estimator": "heuristic-v1", "text": "## How It Works\n\nLet's examine how the dApp interacts with the blockchain:\n\n1. Wallet connection: \n\n - The `WalletConnect` component uses the browser's Ethereum provider (MetaMask) to connect to the user's wallet.\n - It handles network switching to ensure the user is connected to the Polkadot Hub TestNet.\n - Once connected, it provides the user's account address to the parent component.\n\n2. Reading data:\n\n - The `ReadContract` component uses viem's `readContract` function to call the `storedNumber` view function.\n - It periodically polls for updates to keep the UI in sync with the blockchain state.\n - The component displays a loading indicator while fetching data and handles error states.\n\n3. Writing data:\n\n - The `WriteContract` component uses viem's `writeContract` function to send a transaction to the `setNumber` function.\n - It ensures the wallet is connected before allowing a transaction.\n - The component shows detailed feedback during transaction submission and confirmation.\n - After a successful transaction, the value displayed in the `ReadContract` component will update on the next poll."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 10, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 26779, "end_char": 27636, "estimated_token_count": 175, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully built a fully functional dApp that interacts with a smart contract on Polkadot Hub using viem and Next.js. Your application can now:\n\n- Connect to a user's wallet and handle network switching.\n- Read data from a smart contract and keep it updated.\n- Write data to the blockchain through transactions.\n\nThese fundamental skills provide the foundation for building more complex dApps on Polkadot Hub. With this knowledge, you can extend your application to interact with more sophisticated smart contracts and create advanced user interfaces.\n\nTo get started right away with a working example, you can clone the repository and navigate to the implementation:\n\n```\ngit clone https://github.com/polkadot-developers/polkavm-storage-contract-dapps.git -b v0.0.2\ncd polkavm-storage-contract-dapps/viem-dapp\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-create-dapp-viem", "page_title": "Create a dApp With Viem", "index": 11, "depth": 2, "title": "Where to Go Next", "anchor": "where-to-go-next", "start_char": 27636, "end_char": 27950, "estimated_token_count": 81, "token_estimator": "heuristic-v1", "text": "## Where to Go Next\n\n
\n\n- Guide __Create a dApp with Wagmi__\n\n ---\n\n Learn how to build a decentralized application by using the Wagmi framework.\n\n [:octicons-arrow-right-24: Get Started](/develop/smart-contracts/libraries/wagmi)\n\n
"} {"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 0, "depth": 2, "title": "Introduction", "anchor": "introduction", "start_char": 202, "end_char": 841, "estimated_token_count": 136, "token_estimator": "heuristic-v1", "text": "## Introduction\n\nAfter creating a smart contract, the next crucial steps are testing and deployment. Proper testing ensures your contract behaves as expected, while deployment makes your contract available on the blockchain. This tutorial will guide you through using Hardhat, a popular development environment, to test and deploy the `Storage.sol` contract you created in the [Create a Smart Contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts/){target=\\_blank} tutorial. For more information about Hardhat usage, check the [Hardhat guide](/develop/smart-contracts/dev-environments/hardhat/){target=\\_blank}."} {"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 1, "depth": 2, "title": "Prerequisites", "anchor": "prerequisites", "start_char": 841, "end_char": 1367, "estimated_token_count": 147, "token_estimator": "heuristic-v1", "text": "## Prerequisites\n\nBefore starting, make sure you have:\n\n- The [`Storage.sol` contract](/tutorials/smart-contracts/launch-your-first-project/create-contracts/#create-the-smart-contract){target=\\_blank} created in the previous tutorial.\n- [Node.js](https://nodejs.org/){target=\\_blank} (v16.0.0 or later) and npm installed.\n- Basic understanding of JavaScript for writing tests.\n- Some PAS test tokens to cover transaction fees (obtained from the [Polkadot faucet](https://faucet.polkadot.io/?parachain=1111){target=\\_blank})."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 2, "depth": 2, "title": "Setting Up the Development Environment", "anchor": "setting-up-the-development-environment", "start_char": 1367, "end_char": 3576, "estimated_token_count": 504, "token_estimator": "heuristic-v1", "text": "## Setting Up the Development Environment\n\nLet's start by setting up Hardhat for your Storage contract project:\n\n1. Create a new directory for your project and navigate into it:\n\n ```bash\n mkdir storage-hardhat\n cd storage-hardhat\n ```\n\n2. Initialize a new npm project:\n\n ```bash\n npm init -y\n ```\n\n3. Install `hardhat-polkadot` and all required plugins:\n\n ```bash\n npm install --save-dev @parity/hardhat-polkadot@0.1.9 solc@0.8.28\n ```\n\n For dependencies compatibility, ensure to install the `@nomicfoundation/hardhat-toolbox` dependency with the `--force` flag:\n\n ```bash\n npm install --force @nomicfoundation/hardhat-toolbox \n ```\n\n5. Initialize a Hardhat project:\n\n ```bash\n npx hardhat-polkadot init\n ```\n\n Select **Create an empty hardhat.config.js** when prompted.\n\n6. Configure Hardhat by updating the `hardhat.config.js` file:\n\n ```javascript title=\"hardhat.config.js\"\n \n ```\n\n Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. 
\n\n If you need to build these binaries, follow the [Installation](/develop/smart-contracts/local-development-node#install-the-substrate-node-and-eth-rpc-adapter){target=\\_blank} section on the Local Development Node page.\n\n The configuration also defines two network settings: \n\n - **`localNode`**: Runs a PolkaVM instance on `http://127.0.0.1:8545` for local development and testing.\n - **`passetHub`**: Connects to the Polkadot Hub TestNet network using a predefined RPC URL and a private key stored in environment variables.\n\n7. Export your private key and save it in your Hardhat environment:\n\n ```bash\n npx hardhat vars set PRIVATE_KEY \"INSERT_PRIVATE_KEY\"\n ```\n\n Replace `INSERT_PRIVATE_KEY` with your actual private key. \n \n For further details on private key exportation, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\\_blank}.\n\n !!! warning\n Keep your private key safe, and never share it with anyone. If it is compromised, your funds can be stolen."}
{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 3, "depth": 2, "title": "Adding the Smart Contract", "anchor": "adding-the-smart-contract", "start_char": 3576, "end_char": 4951, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "## Adding the Smart Contract\n\n1. Create a new folder called `contracts` and create a `Storage.sol` file. Add the contract code from the previous tutorial:\n\n ```solidity title=\"Storage.sol\"\n // SPDX-License-Identifier: MIT\n pragma solidity ^0.8.28;\n\n contract Storage {\n // State variable to store our number\n uint256 private number;\n\n // Event to notify when the number changes\n event NumberChanged(uint256 newNumber);\n\n // Function to store a new number\n function store(uint256 newNumber) public {\n number = newNumber;\n emit NumberChanged(newNumber);\n }\n\n // Function to retrieve the stored number\n function retrieve() public view returns (uint256) {\n return number;\n }\n }\n ```\n\n2. Compile the contract:\n\n ```bash\n npx hardhat compile\n ```\n\n3. If successful, you will see the following output in your terminal:\n\n 
\n npx hardhat compile\n Compiling 1 Solidity file\n Successfully compiled 1 Solidity file\n
\n\nAfter compilation, the `artifacts-pvm` and `cache-pvm` folders, containing the metadata and binary files of your compiled contract, will be created in the root of your project."} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 4, "depth": 2, "title": "Writing Tests", "anchor": "writing-tests", "start_char": 4951, "end_char": 11980, "estimated_token_count": 1480, "token_estimator": "heuristic-v1", "text": "## Writing Tests\n\nTesting is a critical part of smart contract development. Hardhat makes it easy to write tests in JavaScript using frameworks like [Mocha](https://mochajs.org/){target=\\_blank} and [Chai](https://www.chaijs.com/){target=\\_blank}.\n\n1. Create a folder for testing called `test`. Inside that directory, create a file named `Storage.js` and add the following code:\n\n ```javascript title=\"Storage.js\" \n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n // Add your logic here\n });\n });\n ```\n\n The `beforeEach` hook ensures stateless contract execution by redeploying a fresh instance of the Storage contract before each test case. This approach guarantees that each test starts with a clean and independent contract state by using `ethers.getSigners()` to obtain test accounts and `ethers.getContractFactory('Storage').deploy()` to create a new contract instance.\n\n Now, you can add custom unit tests to check your contract functionality. Some example tests are available below:\n\n 1. **Initial state verification**: Ensures that the contract starts with a default value of zero, which is a fundamental expectation for the `Storage.sol` contract.\n\n ```javascript title=\"Storage.js\"\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n ```\n\n Explanation:\n\n - Checks the initial state of the contract.\n - Verifies that a newly deployed contract has a default value of 0.\n - Confirms the `retrieve()` method works correctly for a new contract.\n\n 2. **Value storage test**: Validate the core functionality of storing and retrieving a value in the contract.\n\n ```javascript title=\"Storage.js\"\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n ```\n\n Explanation:\n\n - Demonstrates the ability to store a specific value.\n - Checks that the stored value can be retrieved correctly.\n - Verifies the basic write and read functionality of the contract.\n\n 3. 
**Event emission verification**: Confirm that the contract emits the correct event when storing a value, which is crucial for off-chain tracking.\n\n ```javascript title=\"Storage.js\"\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n ```\n\n Explanation:\n\n - Ensures the `NumberChanged` event is emitted during storage.\n - Verifies that the event contains the correct stored value.\n - Validates the contract's event logging mechanism.\n\n 4. **Sequential value storage test**: Check the contract's ability to store multiple values sequentially and maintain the most recent value.\n\n ```javascript title=\"Storage.js\"\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n ```\n\n Explanation:\n\n - Verifies that multiple values can be stored in sequence.\n - Confirms that each new store operation updates the contract's state.\n - Demonstrates the contract's ability always to reflect the most recently stored value.\n\n The complete `test/Storage.js` should look like this:\n\n ???--- code \"View complete script\"\n ```javascript title=\"Storage.js\"\n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n });\n });\n ```\n\n2. Run the tests:\n\n ```bash\n npx hardhat test\n ```\n\n3. After running the above command, you will see the output showing that all tests have passed:\n\n
\n npx hardhat test\n Storage\n Basic functionality\n ✔ Should return 0 initially\n ✔ Should update when store is called (1126ms)\n ✔ Should emit an event when storing a value (1131ms)\n ✔ Should allow storing sequentially increasing values (12477ms)\n 4 passing (31s) \n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 5, "depth": 2, "title": "Deploying with Ignition", "anchor": "deploying-with-ignition", "start_char": 11980, "end_char": 14983, "estimated_token_count": 731, "token_estimator": "heuristic-v1", "text": "## Deploying with Ignition\n\n[Hardhat's Ignition](https://hardhat.org/ignition/docs/getting-started#overview){target=\\_blank} is a deployment system designed to make deployments predictable and manageable. Let's create a deployment script:\n\n1. Create a new folder called`ignition/modules`. Add a new file named `StorageModule.js` with the following logic:\n\n ```javascript title=\"StorageModule.js\"\n \n ```\n\n2. Deploy to the local network:\n\n 1. First, start a local node:\n\n ```bash\n npx hardhat node\n ```\n\n 2. Then, in a new terminal window, deploy the contract:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n ```\n\n 3. If successful, output similar to the following will display in your terminal:\n\n
\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n ✔ Confirm deploy to network localNode (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xc01Ee7f10EA4aF4673cFff62710E1D7792aBa8f3\n
\n\n3. Deploy to the Polkadot Hub TestNet:\n\n 1. Make sure your account has enough PAS tokens for gas fees, then run:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n ```\n\n 2. After deployment, you'll see the contract address in the console output. Save this address for future interactions.\n\n
\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n ✔ Confirm deploy to network passetHub (420420422)? … yes\n \n Hardhat Ignition 🚀\n \n Deploying [ StorageModule ]\n \n Batch #1\n Executed StorageModule#Storage\n \n [ StorageModule ] successfully deployed 🚀\n \n Deployed Addresses\n \n StorageModule#Storage - 0xE8693cE64b294E26765573398C7Ca5C700E9C85c\n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 6, "depth": 2, "title": "Interacting with Your Deployed Contract", "anchor": "interacting-with-your-deployed-contract", "start_char": 14983, "end_char": 16937, "estimated_token_count": 460, "token_estimator": "heuristic-v1", "text": "## Interacting with Your Deployed Contract\n\nTo interact with your deployed contract:\n\n1. Create a new folder named `scripts` and add the `interact.js` with the following content:\n\n ```javascript title=\"interact.js\"\n const hre = require('hardhat');\n\n async function main() {\n // Replace with your deployed contract address\n const contractAddress = 'INSERT_DEPLOYED_CONTRACT_ADDRESS';\n\n // Get the contract instance\n const Storage = await hre.ethers.getContractFactory('Storage');\n const storage = await Storage.attach(contractAddress);\n\n // Get current value\n const currentValue = await storage.retrieve();\n console.log('Current stored value:', currentValue.toString());\n\n // Store a new value\n const newValue = 42;\n console.log(`Storing new value: ${newValue}...`);\n const tx = await storage.store(newValue);\n\n // Wait for transaction to be mined\n await tx.wait();\n console.log('Transaction confirmed');\n\n // Get updated value\n const updatedValue = await storage.retrieve();\n console.log('Updated stored value:', updatedValue.toString());\n }\n\n main()\n .then(() => process.exit(0))\n .catch((error) => {\n console.error(error);\n process.exit(1);\n });\n ```\n\n Ensure that `INSERT_DEPLOYED_CONTRACT_ADDRESS` is replaced with the value obtained in the previous step.\n\n2. Run the interaction script:\n\n ```bash\n npx hardhat run scripts/interact.js --network passetHub\n ```\n\n3. If successful, the terminal will show the following output:\n\n
\n npx hardhat run scripts/interact.js --network passetHub\n Current stored value: 0\n Storing new value: 42...\n Transaction confirmed\n Updated stored value: 42\n
"} -{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 7, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 16937, "end_char": 17542, "estimated_token_count": 122, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully set up a Hardhat development environment, written comprehensive tests for your Storage contract, and deployed it to local and Polkadot Hub TestNet networks. This tutorial covered essential steps in smart contract development, including configuration, testing, deployment, and interaction.\n\nTo get started with a working example right away, you can clone the repository and navigate to the project directory:\n\n```bash\ngit clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.8\ncd polkavm-hardhat-examples/storage-hardhat\n```"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 2, "depth": 2, "title": "Setting Up the Development Environment", "anchor": "setting-up-the-development-environment", "start_char": 1367, "end_char": 4508, "estimated_token_count": 706, "token_estimator": "heuristic-v1", "text": "## Setting Up the Development Environment\n\nLet's start by setting up Hardhat for your Storage contract project:\n\n1. Create a new directory for your project and navigate into it:\n\n ```bash\n mkdir storage-hardhat\n cd storage-hardhat\n ```\n\n2. Initialize a new npm project:\n\n ```bash\n npm init -y\n ```\n\n3. Install `hardhat-polkadot` and all required plugins:\n\n ```bash\n npm install --save-dev @parity/hardhat-polkadot@0.1.9 solc@0.8.28\n ```\n\n For dependencies compatibility, ensure to install the `@nomicfoundation/hardhat-toolbox` dependency with the `--force` flag:\n\n ```bash\n npm install --force @nomicfoundation/hardhat-toolbox \n ```\n\n5. Initialize a Hardhat project:\n\n ```bash\n npx hardhat-polkadot init\n ```\n\n Select **Create an empty hardhat.config.js** when prompted.\n\n6. Configure Hardhat by updating the `hardhat.config.js` file:\n\n ```javascript title=\"hardhat.config.js\"\n require(\"@nomicfoundation/hardhat-toolbox\");\n\n require(\"@parity/hardhat-polkadot\");\n\n const { vars } = require(\"hardhat/config\");\n\n /** @type import('hardhat/config').HardhatUserConfig */\n module.exports = {\n solidity: \"0.8.28\",\n resolc: {\n compilerSource: \"npm\",\n },\n networks: {\n hardhat: {\n polkavm: true,\n nodeConfig: {\n nodeBinaryPath: 'INSERT_PATH_TO_SUBSTRATE_NODE',\n rpcPort: 8000,\n dev: true,\n },\n adapterConfig: {\n adapterBinaryPath: 'INSERT_PATH_TO_ETH_RPC_ADAPTER',\n dev: true,\n },\n },\n localNode: {\n polkavm: true,\n url: `http://127.0.0.1:8545`,\n },\n passetHub: {\n polkavm: true,\n url: 'https://testnet-passet-hub-eth-rpc.polkadot.io',\n accounts: [vars.get(\"PRIVATE_KEY\")],\n },\n },\n };\n ```\n\n Ensure that `INSERT_PATH_TO_SUBSTRATE_NODE` and `INSERT_PATH_TO_ETH_RPC_ADAPTER` are replaced with the proper paths to the compiled binaries. 
\n\n    If you need to build these binaries, follow the [Installation](/develop/smart-contracts/local-development-node#install-the-substrate-node-and-eth-rpc-adapter){target=\_blank} section on the Local Development Node page.\n\n    The configuration also defines two network settings:\n\n    - **`localNode`**: Runs a PolkaVM instance on `http://127.0.0.1:8545` for local development and testing.\n    - **`passetHub`**: Connects to the Polkadot Hub TestNet network using a predefined RPC URL and a private key stored as a Hardhat configuration variable.\n\n6. Export your private key and save it in your Hardhat environment:\n\n    ```bash\n    npx hardhat vars set PRIVATE_KEY \"INSERT_PRIVATE_KEY\"\n    ```\n\n    Replace `INSERT_PRIVATE_KEY` with your actual private key.\n    \n    For further details on exporting your private key, refer to the article [How to export an account's private key](https://support.metamask.io/configure/accounts/how-to-export-an-accounts-private-key/){target=\_blank}.\n\n    !!! warning\n        Keep your private key safe, and never share it with anyone. If it is compromised, your funds can be stolen."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 3, "depth": 2, "title": "Adding the Smart Contract", "anchor": "adding-the-smart-contract", "start_char": 4508, "end_char": 5883, "estimated_token_count": 293, "token_estimator": "heuristic-v1", "text": "## Adding the Smart Contract\n\n1. Create a new folder called `contracts` and create a `Storage.sol` file inside it. Add the contract code from the previous tutorial:\n\n    ```solidity title=\"Storage.sol\"\n    // SPDX-License-Identifier: MIT\n    pragma solidity ^0.8.28;\n\n    contract Storage {\n        // State variable to store our number\n        uint256 private number;\n\n        // Event to notify when the number changes\n        event NumberChanged(uint256 newNumber);\n\n        // Function to store a new number\n        function store(uint256 newNumber) public {\n            number = newNumber;\n            emit NumberChanged(newNumber);\n        }\n\n        // Function to retrieve the stored number\n        function retrieve() public view returns (uint256) {\n            return number;\n        }\n    }\n    ```\n\n2. Compile the contract:\n\n    ```bash\n    npx hardhat compile\n    ```\n\n3. If successful, you will see the following output in your terminal:\n\n    
\n npx hardhat compile\n Compiling 1 Solidity file\n Successfully compiled 1 Solidity file\n
\n\nAfter compilation, the `artifacts-pvm` and `cache-pvm` folders, containing the metadata and binary files of your compiled contract, will be created in the root of your project."} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 4, "depth": 2, "title": "Writing Tests", "anchor": "writing-tests", "start_char": 5883, "end_char": 12912, "estimated_token_count": 1480, "token_estimator": "heuristic-v1", "text": "## Writing Tests\n\nTesting is a critical part of smart contract development. Hardhat makes it easy to write tests in JavaScript using frameworks like [Mocha](https://mochajs.org/){target=\\_blank} and [Chai](https://www.chaijs.com/){target=\\_blank}.\n\n1. Create a folder for testing called `test`. Inside that directory, create a file named `Storage.js` and add the following code:\n\n ```javascript title=\"Storage.js\" \n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n // Add your logic here\n });\n });\n ```\n\n The `beforeEach` hook ensures stateless contract execution by redeploying a fresh instance of the Storage contract before each test case. This approach guarantees that each test starts with a clean and independent contract state by using `ethers.getSigners()` to obtain test accounts and `ethers.getContractFactory('Storage').deploy()` to create a new contract instance.\n\n Now, you can add custom unit tests to check your contract functionality. Some example tests are available below:\n\n 1. **Initial state verification**: Ensures that the contract starts with a default value of zero, which is a fundamental expectation for the `Storage.sol` contract.\n\n ```javascript title=\"Storage.js\"\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n ```\n\n Explanation:\n\n - Checks the initial state of the contract.\n - Verifies that a newly deployed contract has a default value of 0.\n - Confirms the `retrieve()` method works correctly for a new contract.\n\n 2. **Value storage test**: Validate the core functionality of storing and retrieving a value in the contract.\n\n ```javascript title=\"Storage.js\"\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n ```\n\n Explanation:\n\n - Demonstrates the ability to store a specific value.\n - Checks that the stored value can be retrieved correctly.\n - Verifies the basic write and read functionality of the contract.\n\n 3. 
**Event emission verification**: Confirm that the contract emits the correct event when storing a value, which is crucial for off-chain tracking.\n\n ```javascript title=\"Storage.js\"\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n ```\n\n Explanation:\n\n - Ensures the `NumberChanged` event is emitted during storage.\n - Verifies that the event contains the correct stored value.\n - Validates the contract's event logging mechanism.\n\n 4. **Sequential value storage test**: Check the contract's ability to store multiple values sequentially and maintain the most recent value.\n\n ```javascript title=\"Storage.js\"\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n ```\n\n Explanation:\n\n - Verifies that multiple values can be stored in sequence.\n - Confirms that each new store operation updates the contract's state.\n - Demonstrates the contract's ability always to reflect the most recently stored value.\n\n The complete `test/Storage.js` should look like this:\n\n ???--- code \"View complete script\"\n ```javascript title=\"Storage.js\"\n const { expect } = require('chai');\n const { ethers } = require('hardhat');\n\n describe('Storage', function () {\n let storage;\n let owner;\n let addr1;\n\n beforeEach(async function () {\n // Get signers\n [owner, addr1] = await ethers.getSigners();\n\n // Deploy the Storage contract\n const Storage = await ethers.getContractFactory('Storage');\n storage = await Storage.deploy();\n await storage.waitForDeployment();\n });\n\n describe('Basic functionality', function () {\n it('Should return 0 initially', async function () {\n expect(await storage.retrieve()).to.equal(0);\n });\n\n it('Should update when store is called', async function () {\n const testValue = 42;\n // Store a value\n await storage.store(testValue);\n // Check if the value was updated\n expect(await storage.retrieve()).to.equal(testValue);\n });\n\n it('Should emit an event when storing a value', async function () {\n const testValue = 100;\n // Check if the NumberChanged event is emitted with the correct value\n await expect(storage.store(testValue))\n .to.emit(storage, 'NumberChanged')\n .withArgs(testValue);\n });\n\n it('Should allow storing sequentially increasing values', async function () {\n const values = [10, 20, 30, 40];\n\n for (const value of values) {\n await storage.store(value);\n expect(await storage.retrieve()).to.equal(value);\n }\n });\n });\n });\n ```\n\n2. Run the tests:\n\n ```bash\n npx hardhat test\n ```\n\n3. After running the above command, you will see the output showing that all tests have passed:\n\n
\n    npx hardhat test\n    Storage\n        Basic functionality\n          ✔ Should return 0 initially\n          ✔ Should update when store is called (1126ms)\n          ✔ Should emit an event when storing a value (1131ms)\n          ✔ Should allow storing sequentially increasing values (12477ms)\n    4 passing (31s) \n    
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 5, "depth": 2, "title": "Deploying with Ignition", "anchor": "deploying-with-ignition", "start_char": 12912, "end_char": 16132, "estimated_token_count": 786, "token_estimator": "heuristic-v1", "text": "## Deploying with Ignition\n\n[Hardhat's Ignition](https://hardhat.org/ignition/docs/getting-started#overview){target=\_blank} is a deployment system designed to make deployments predictable and manageable. Let's create a deployment script:\n\n1. Create a new folder called `ignition/modules`. Add a new file named `StorageModule.js` with the following logic:\n\n    ```javascript title=\"StorageModule.js\"\n    const { buildModule } = require('@nomicfoundation/hardhat-ignition/modules');\n\n    module.exports = buildModule('StorageModule', (m) => {\n      const storage = m.contract('Storage');\n\n      return { storage };\n    });\n    ```\n\n2. Deploy to the local network:\n\n    1. First, start a local node:\n\n        ```bash\n        npx hardhat node\n        ```\n\n    2. Then, in a new terminal window, deploy the contract:\n\n        ```bash\n        npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n        ```\n\n    3. If successful, output similar to the following will display in your terminal:\n\n        
\n    npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network localNode\n    ✔ Confirm deploy to network localNode (420420422)? … yes\n    \n    Hardhat Ignition 🚀\n    \n    Deploying [ StorageModule ]\n    \n    Batch #1\n      Executed StorageModule#Storage\n    \n    [ StorageModule ] successfully deployed 🚀\n    \n    Deployed Addresses\n    \n    StorageModule#Storage - 0xc01Ee7f10EA4aF4673cFff62710E1D7792aBa8f3\n    
\n\n3. Deploy to the Polkadot Hub TestNet:\n\n 1. Make sure your account has enough PAS tokens for gas fees, then run:\n\n ```bash\n npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n ```\n\n 2. After deployment, you'll see the contract address in the console output. Save this address for future interactions.\n\n
\n    npx hardhat ignition deploy ./ignition/modules/StorageModule.js --network passetHub\n    ✔ Confirm deploy to network passetHub (420420422)? … yes\n    \n    Hardhat Ignition 🚀\n    \n    Deploying [ StorageModule ]\n    \n    Batch #1\n      Executed StorageModule#Storage\n    \n    [ StorageModule ] successfully deployed 🚀\n    \n    Deployed Addresses\n    \n    StorageModule#Storage - 0xE8693cE64b294E26765573398C7Ca5C700E9C85c\n    
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 6, "depth": 2, "title": "Interacting with Your Deployed Contract", "anchor": "interacting-with-your-deployed-contract", "start_char": 16132, "end_char": 18086, "estimated_token_count": 460, "token_estimator": "heuristic-v1", "text": "## Interacting with Your Deployed Contract\n\nTo interact with your deployed contract:\n\n1. Create a new folder named `scripts` and add the `interact.js` with the following content:\n\n ```javascript title=\"interact.js\"\n const hre = require('hardhat');\n\n async function main() {\n // Replace with your deployed contract address\n const contractAddress = 'INSERT_DEPLOYED_CONTRACT_ADDRESS';\n\n // Get the contract instance\n const Storage = await hre.ethers.getContractFactory('Storage');\n const storage = await Storage.attach(contractAddress);\n\n // Get current value\n const currentValue = await storage.retrieve();\n console.log('Current stored value:', currentValue.toString());\n\n // Store a new value\n const newValue = 42;\n console.log(`Storing new value: ${newValue}...`);\n const tx = await storage.store(newValue);\n\n // Wait for transaction to be mined\n await tx.wait();\n console.log('Transaction confirmed');\n\n // Get updated value\n const updatedValue = await storage.retrieve();\n console.log('Updated stored value:', updatedValue.toString());\n }\n\n main()\n .then(() => process.exit(0))\n .catch((error) => {\n console.error(error);\n process.exit(1);\n });\n ```\n\n Ensure that `INSERT_DEPLOYED_CONTRACT_ADDRESS` is replaced with the value obtained in the previous step.\n\n2. Run the interaction script:\n\n ```bash\n npx hardhat run scripts/interact.js --network passetHub\n ```\n\n3. If successful, the terminal will show the following output:\n\n
\n npx hardhat run scripts/interact.js --network passetHub\n Current stored value: 0\n Storing new value: 42...\n Transaction confirmed\n Updated stored value: 42\n
"} +{"page_id": "tutorials-smart-contracts-launch-your-first-project-test-and-deploy-with-hardhat", "page_title": "Test and Deploy with Hardhat", "index": 7, "depth": 2, "title": "Conclusion", "anchor": "conclusion", "start_char": 18086, "end_char": 18691, "estimated_token_count": 122, "token_estimator": "heuristic-v1", "text": "## Conclusion\n\nCongratulations! You've successfully set up a Hardhat development environment, written comprehensive tests for your Storage contract, and deployed it to local and Polkadot Hub TestNet networks. This tutorial covered essential steps in smart contract development, including configuration, testing, deployment, and interaction.\n\nTo get started with a working example right away, you can clone the repository and navigate to the project directory:\n\n```bash\ngit clone https://github.com/polkadot-developers/polkavm-hardhat-examples.git -b v0.0.8\ncd polkavm-hardhat-examples/storage-hardhat\n```"} {"page_id": "tutorials-smart-contracts-launch-your-first-project", "page_title": "Launch Your First Project", "index": 0, "depth": 2, "title": "Development Pathway", "anchor": "development-pathway", "start_char": 873, "end_char": 1162, "estimated_token_count": 65, "token_estimator": "heuristic-v1", "text": "## Development Pathway\n\n- **Beginner-friendly**: Step-by-step instructions suitable for newcomers to smart contract development.\n- **Hands-on learning**: Practical exercises that build real-world skills.\n- **Production-ready**: Progress from basic concepts to deployment-ready contracts."} {"page_id": "tutorials-smart-contracts-launch-your-first-project", "page_title": "Launch Your First Project", "index": 1, "depth": 2, "title": "In This Section", "anchor": "in-this-section", "start_char": 1162, "end_char": 1211, "estimated_token_count": 12, "token_estimator": "heuristic-v1", "text": "## In This Section\n\n:::INSERT_IN_THIS_SECTION:::"} {"page_id": "tutorials-smart-contracts", "page_title": "Smart Contracts", "index": 0, "depth": 2, "title": "What to Expect from These Tutorials", "anchor": "what-to-expect-from-these-tutorials", "start_char": 440, "end_char": 739, "estimated_token_count": 67, "token_estimator": "heuristic-v1", "text": "## What to Expect from These Tutorials\n\n- **Beginner to advanced**: Suitable for developers of all levels.\n- **Complete workflows**: Covers the entire process from writing code to on-chain deployment.\n- **Interactive examples**: Follow along with real, working code that you can modify and expand."}