@@ -59,6 +59,32 @@ const [blockNumber, balance, ensName] = await Promise.all([
 ])
 ```
 
+You can also use the `waitAsRateLimit` option to send at most one batch every `wait` milliseconds once the batch size is reached.
+
+```ts twoslash
+import { createPublicClient, http } from 'viem'
+import { mainnet } from 'viem/chains'
+
+const client = createPublicClient({
+  chain: mainnet,
+  transport: http('https://1.rpc.thirdweb.com/...', {
+    batch: {
+      batchSize: 3,
+      wait: 100,
+      waitAsRateLimit: true, // [!code focus]
+    },
+  }),
+})
+// ---cut---
+// Batches are dispatched at most once every 100 milliseconds.
+const [blockNumber, balance, ensName] = await Promise.all([
+  client.getBlockNumber(),
+  client.getBalance({ address: '0xd2135CfB216b74109775236E36d4b433F1DF507B' }),
+  client.getEnsName({ address: '0xd2135CfB216b74109775236E36d4b433F1DF507B' }),
+  // client.get....
+])
+```
+
 ## Parameters
 
 ### url (optional)
@@ -123,6 +149,25 @@ const transport = http('https://1.rpc.thirdweb.com/...', {
 })
 ```
 
+### batch.waitAsRateLimit (optional)
+
+- **Type:** `boolean`
+- **Default:** `false`
+
+Sends at most one batch every `wait` milliseconds once the batch size is reached. By default, multiple batches may be dispatched at once every `wait` milliseconds.
+
+Warning: This can lead to a high number of pending requests if the batch size is constantly exceeded without enough time to clear the queue.
+
+```ts twoslash
+import { http } from 'viem'
+// ---cut---
+const transport = http('https://1.rpc.thirdweb.com/...', {
+  batch: {
+    waitAsRateLimit: true // [!code focus]
+  }
+})
+```
+
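+To make the warning above concrete, here is a rough, illustrative sketch (the RPC URL and the numbers are placeholders, not recommendations):
+
+```ts twoslash
+import { createPublicClient, http } from 'viem'
+import { mainnet } from 'viem/chains'
+
+const client = createPublicClient({
+  chain: mainnet,
+  // Illustrative values: batchSize 3 with wait 200 caps throughput at
+  // 3 requests / 200 ms = 15 requests per second.
+  transport: http('https://1.rpc.thirdweb.com/...', {
+    batch: { batchSize: 3, wait: 200, waitAsRateLimit: true },
+  }),
+})
+// ---cut---
+// Firing 30 requests at once queues 10 batches. At one batch per 200 ms,
+// the last batch is dispatched roughly 2 seconds later. If requests keep
+// arriving faster than ~15 per second, the pending queue keeps growing.
+const blocks = await Promise.all(
+  Array.from({ length: 30 }, (_, i) => client.getBlock({ blockNumber: BigInt(i + 1) })),
+)
+```
+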
 ### fetchOptions (optional)
 
 - **Type:** [`RequestInit`](https://developer.mozilla.org/en-US/docs/Web/API/fetch)