Merged
5 changes: 5 additions & 0 deletions .changeset/monorepo-aware-setup-commands.md
@@ -0,0 +1,5 @@
---
'@tanstack/intent': patch
---

Make `edit-package-json` and `add-library-bin` monorepo-aware: when run from a monorepo root, they discover workspace packages containing SKILL.md files and apply changes to each package's package.json. Also improve domain-discovery skill to read in-repo docs before interviewing and avoid asking factual questions the agent can answer by searching the codebase.
29 changes: 24 additions & 5 deletions packages/intent/meta/domain-discovery/SKILL.md
@@ -76,23 +76,33 @@ These rules override any other reasoning. No exceptions.
STOP and WAIT for their reply. Do not answer your own questions. Do
not infer answers from documentation. Do not skip questions because
you believe you already know the answer.
3. **Do not convert open-ended questions into multiple-choice,
3. **Never ask factual questions you can answer by searching the
codebase.** Before asking any question, determine whether the answer
is a deterministic fact (how many X exist, what versions are
supported, which files implement Y) or a judgment call (which ones
matter, what should we prioritize, what do developers struggle with).
Factual questions must be answered by searching the code — grep,
glob, read files. Only ask the maintainer for priorities, opinions,
trade-offs, and implicit knowledge that cannot be found in code or
docs. Asking the maintainer a question whose answer is sitting in
the codebase wastes their time and erodes trust in the process.
4. **Do not convert open-ended questions into multiple-choice,
yes/no, or confirmation prompts.** The question templates in each
sub-section are open-ended by design. Present them as open-ended
questions. The maintainer's unprompted answers surface knowledge that
pre-structured options suppress.
4. **Minimum question counts are enforced.** Each sub-section specifies
5. **Minimum question counts are enforced.** Each sub-section specifies
a question count range (e.g. "2–4 questions"). You must ask at least
the minimum number. Asking zero questions in any sub-section is a
protocol violation.
5. **STOP gates are mandatory.** At the boundaries marked `── STOP ──`
6. **STOP gates are mandatory.** At the boundaries marked `── STOP ──`
below, you must halt execution and wait for the maintainer's response
or acknowledgment before proceeding. Do not continue past a STOP gate
in the same message.
6. **If the maintainer asks to skip an interview phase**, explain the
7. **If the maintainer asks to skip an interview phase**, explain the
value of the phase and what will be lost. Proceed with skipping only
if they confirm a second time.
7. **Rich documentation makes interviews MORE valuable, not less.**
8. **Rich documentation makes interviews MORE valuable, not less.**
When docs are comprehensive, the interview surfaces what docs miss:
implicit knowledge, AI-specific failure modes, undocumented tradeoffs,
and the maintainer's prioritization of what matters most. Never
@@ -115,6 +125,15 @@ reading exhaustively yet.
4. **AGENTS.md or .cursorrules** — if the library already has agent
guidance, read it. This is high-signal for what the maintainer
considers important
5. **All in-repo documentation** — list every `.md` file in the `docs/`
directory (and any other documentation directories like `guides/`,
`reference/`, `wiki/`). Read every file. This is NOT the exhaustive
external doc reading from Phase 3 — this is reading what the
maintainer committed to the repository, which is fast and
high-signal. In-repo docs often contain migration guides, backward
compatibility notes, architecture decisions, and other context that
prevents you from asking factual questions the docs already answer.
Do not sample a subset — read them all before the first interview.
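
Item 5 above can be mechanized rather than eyeballed; a minimal sketch (the `listMarkdown` helper and the fixture tree are hypothetical, not part of the skill):

```typescript
import { mkdtempSync, mkdirSync, readdirSync, writeFileSync } from 'node:fs'
import { tmpdir } from 'node:os'
import { join } from 'node:path'

// Hypothetical helper: recursively list every .md file under a docs
// directory, so no file is sampled or skipped.
function listMarkdown(dir: string, out: Array<string> = []): Array<string> {
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name)
    if (entry.isDirectory()) listMarkdown(path, out)
    else if (entry.name.endsWith('.md')) out.push(path)
  }
  return out.sort()
}

// Demo against a throwaway tree.
const docs = mkdtempSync(join(tmpdir(), 'docs-'))
mkdirSync(join(docs, 'guides'))
writeFileSync(join(docs, 'intro.md'), '# intro')
writeFileSync(join(docs, 'guides', 'migration.md'), '# migration')
writeFileSync(join(docs, 'notes.txt'), 'not markdown')

const files = listMarkdown(docs).map((p) => p.slice(docs.length + 1))
console.log(files) // → [ 'guides/migration.md', 'intro.md' ]
```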

### 1b — Read peer dependency constraints

8 changes: 4 additions & 4 deletions packages/intent/src/cli.ts
@@ -565,13 +565,13 @@ skills:
break
}
case 'add-library-bin': {
const { runAddLibraryBin } = await import('./setup.js')
runAddLibraryBin(process.cwd())
const { runAddLibraryBinAll } = await import('./setup.js')
runAddLibraryBinAll(process.cwd())
break
}
case 'edit-package-json': {
const { runEditPackageJson } = await import('./setup.js')
runEditPackageJson(process.cwd())
const { runEditPackageJsonAll } = await import('./setup.js')
runEditPackageJsonAll(process.cwd())
break
}
case 'setup-github-actions': {
176 changes: 175 additions & 1 deletion packages/intent/src/setup.ts
@@ -5,7 +5,9 @@ import {
readdirSync,
writeFileSync,
} from 'node:fs'
import { join } from 'node:path'
import { join, relative } from 'node:path'
import { parse as parseYaml } from 'yaml'
import { findSkillFiles } from './utils.js'

// ---------------------------------------------------------------------------
// Types
@@ -26,6 +28,11 @@ export interface SetupGithubActionsResult {
skipped: Array<string>
}

export interface MonorepoResult<T> {
package: string
result: T
}

interface TemplateVars {
PACKAGE_NAME: string
REPO: string
@@ -337,6 +344,173 @@ export function runEditPackageJson(root: string): EditPackageJsonResult {
return result
}

// ---------------------------------------------------------------------------
// Monorepo workspace resolution
// ---------------------------------------------------------------------------

function readWorkspacePatterns(root: string): Array<string> | null {
// pnpm-workspace.yaml
const pnpmWs = join(root, 'pnpm-workspace.yaml')
if (existsSync(pnpmWs)) {
try {
const config = parseYaml(readFileSync(pnpmWs, 'utf8')) as Record<
string,
unknown
>
if (Array.isArray(config.packages)) {
return config.packages as Array<string>
}
} catch (err: unknown) {
console.error(
`Warning: failed to parse ${pnpmWs}: ${err instanceof Error ? err.message : err}`,
)
}
}

// package.json workspaces
const pkgPath = join(root, 'package.json')
if (existsSync(pkgPath)) {
try {
const pkg = JSON.parse(readFileSync(pkgPath, 'utf8'))
if (Array.isArray(pkg.workspaces)) {
return pkg.workspaces
}
if (Array.isArray(pkg.workspaces?.packages)) {
return pkg.workspaces.packages
}
} catch (err: unknown) {
console.error(
`Warning: failed to parse ${pkgPath}: ${err instanceof Error ? err.message : err}`,
)
}
}

return null
}
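
For reference, the `package.json` fallback branch accepts both workspace shapes (a bare array, or an object with a `packages` array); a stdlib-only sketch against a hypothetical fixture:

```typescript
import { mkdtempSync, readFileSync, writeFileSync } from 'node:fs'
import { tmpdir } from 'node:os'
import { join } from 'node:path'

// Fixture only: the npm/yarn-style shape used when no
// pnpm-workspace.yaml is present.
const root = mkdtempSync(join(tmpdir(), 'ws-'))
writeFileSync(
  join(root, 'package.json'),
  JSON.stringify({ workspaces: { packages: ['packages/*', 'examples/**'] } }),
)

const pkg = JSON.parse(readFileSync(join(root, 'package.json'), 'utf8'))
const patterns: Array<string> | null = Array.isArray(pkg.workspaces)
  ? pkg.workspaces
  : Array.isArray(pkg.workspaces?.packages)
    ? pkg.workspaces.packages
    : null
console.log(patterns) // → [ 'packages/*', 'examples/**' ]
```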

/**
* Resolve workspace glob patterns to actual package directories.
* Handles simple patterns like "packages/*" and "packages/**".
* Each resolved directory must contain a package.json.
*/
function resolveWorkspacePackages(
root: string,
patterns: Array<string>,
): Array<string> {
const dirs: Array<string> = []

for (const pattern of patterns) {
// Strip trailing /* or /**/* for directory resolution
const base = pattern.replace(/\/\*\*?(\/\*)?$/, '')
const baseDir = join(root, base)
if (!existsSync(baseDir)) continue

if (pattern.includes('**')) {
// Recursive: walk all subdirectories
collectPackageDirs(baseDir, dirs)
} else if (pattern.endsWith('/*')) {
// Single level: direct children
for (const entry of readdirSync(baseDir, { withFileTypes: true })) {
if (!entry.isDirectory()) continue
const dir = join(baseDir, entry.name)
if (existsSync(join(dir, 'package.json'))) {
dirs.push(dir)
}
}
} else {
// Exact path
const dir = join(root, pattern)
if (existsSync(join(dir, 'package.json'))) {
dirs.push(dir)
}
}
}

return dirs
}
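
The trailing-glob strip is the subtle part; a quick sketch of how that regex maps patterns to base directories:

```typescript
// Same trailing-glob strip as resolveWorkspacePackages: it reduces a
// workspace pattern to the directory the walk should start from.
const base = (pattern: string) => pattern.replace(/\/\*\*?(\/\*)?$/, '')

console.log(base('packages/*'))    // → packages
console.log(base('packages/**'))   // → packages
console.log(base('packages/**/*')) // → packages
console.log(base('apps/docs'))     // → apps/docs  (exact path, untouched)
```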

function collectPackageDirs(dir: string, result: Array<string>): void {
if (existsSync(join(dir, 'package.json'))) {
result.push(dir)
}
let entries: Array<import('node:fs').Dirent>
try {
entries = readdirSync(dir, { withFileTypes: true })
} catch (err: unknown) {
console.error(
`Warning: could not read directory ${dir}: ${err instanceof Error ? err.message : err}`,
)
return
}
for (const entry of entries) {
if (
!entry.isDirectory() ||
entry.name === 'node_modules' ||
entry.name.startsWith('.')
)
continue
collectPackageDirs(join(dir, entry.name), result)
}
}
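
The walk can be exercised in isolation; a sketch that re-implements the same skip rules (`node_modules` and dotted directories) against a throwaway tree:

```typescript
import {
  existsSync,
  mkdirSync,
  mkdtempSync,
  readdirSync,
  writeFileSync,
} from 'node:fs'
import { tmpdir } from 'node:os'
import { join } from 'node:path'

// Re-implementation of collectPackageDirs for illustration, with the
// identical skip rules.
function collect(dir: string, out: Array<string>): void {
  if (existsSync(join(dir, 'package.json'))) out.push(dir)
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    if (
      !entry.isDirectory() ||
      entry.name === 'node_modules' ||
      entry.name.startsWith('.')
    )
      continue
    collect(join(dir, entry.name), out)
  }
}

const root = mkdtempSync(join(tmpdir(), 'walk-'))
mkdirSync(join(root, 'a', 'deep'), { recursive: true })
mkdirSync(join(root, 'node_modules', 'dep'), { recursive: true })
writeFileSync(join(root, 'a', 'deep', 'package.json'), '{}')
writeFileSync(join(root, 'node_modules', 'dep', 'package.json'), '{}')

const found: Array<string> = []
collect(root, found)
console.log(found.map((d) => d.slice(root.length + 1))) // → [ 'a/deep' ]
```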

/**
* Find workspace packages that contain at least one SKILL.md file.
*/
function findPackagesWithSkills(root: string): Array<string> {
const patterns = readWorkspacePatterns(root)
if (!patterns) return []

return resolveWorkspacePackages(root, patterns).filter((dir) => {
const skillsDir = join(dir, 'skills')
return existsSync(skillsDir) && findSkillFiles(skillsDir).length > 0
})
}

// ---------------------------------------------------------------------------
// Monorepo-aware command runner
// ---------------------------------------------------------------------------

/**
* When run from a monorepo root, finds all workspace packages with SKILL.md
* files and runs the given command on each. Falls back to single-package
* behavior only when no workspace config is detected. If workspace config
* exists but no packages have skills, warns and returns empty.
*/
function runForEachPackage<T>(
root: string,
runOne: (dir: string) => T,
): Array<MonorepoResult<T>> | T {
const isMonorepo = readWorkspacePatterns(root) !== null
const pkgsWithSkills = isMonorepo ? findPackagesWithSkills(root) : []

if (!isMonorepo) {
return runOne(root)
}

if (pkgsWithSkills.length === 0) {
console.log('No workspace packages with skills found.')
return []
}

return pkgsWithSkills.map((pkgDir) => {
const rel = relative(root, pkgDir) || '.'
console.log(`\n── ${rel} ──`)
return { package: rel, result: runOne(pkgDir) }
})
}
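
Because `runForEachPackage` returns either a bare result or a per-package array, callers must discriminate the union; a sketch with a stand-in runner (the real one resolves workspaces itself, the directory names here are hypothetical):

```typescript
interface MonorepoResult<T> {
  package: string
  result: T
}

// Stand-in with the same union return shape as runForEachPackage.
function runAll<T>(
  workspaceDirs: Array<string> | null,
  runOne: (dir: string) => T,
): Array<MonorepoResult<T>> | T {
  if (workspaceDirs === null) return runOne('.') // single-package fallback
  return workspaceDirs.map((d) => ({ package: d, result: runOne(d) }))
}

const out = runAll(['packages/a', 'packages/b'], (d) => d.toUpperCase())
// Callers discriminate the union with Array.isArray.
if (Array.isArray(out)) {
  for (const { package: pkg, result } of out) console.log(`${pkg}: ${result}`)
} else {
  console.log(out)
}
```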

export function runEditPackageJsonAll(
root: string,
): Array<MonorepoResult<EditPackageJsonResult>> | EditPackageJsonResult {
return runForEachPackage(root, runEditPackageJson)
}

export function runAddLibraryBinAll(
root: string,
): Array<MonorepoResult<AddLibraryBinResult>> | AddLibraryBinResult {
return runForEachPackage(root, runAddLibraryBin)
}

// ---------------------------------------------------------------------------
// Command: setup-github-actions
// ---------------------------------------------------------------------------