
Add machine-facing agent API and structured DSL ergonomics for deterministic, low-ambiguity workflows#1321

Merged
SkBlaz merged 4 commits into master from copilot/improve-py3plex-ergonomics
Mar 31, 2026

Conversation

Contributor

Copilot AI commented Mar 29, 2026

  • Inspect latest failing Documentation Examples Validation workflow/job logs
  • Reproduce docs output generation locally and identify remaining nondeterministic diffs
  • Apply minimal deterministic environment normalization in scripts/generate_docs_outputs.py
  • Regenerate examples/docs_outputs and verify clean rerun diff
  • Commit and push changes
  • Reply to PR comment with commit hash


Copilot AI linked an issue Mar 29, 2026 that may be closed by this pull request
Copilot AI changed the title from "[WIP] Improve ergonomics of py3plex for LLM-based agents" to "Add machine-facing agent API and structured DSL ergonomics for deterministic, low-ambiguity workflows" Mar 29, 2026
Copilot AI requested a review from SkBlaz March 29, 2026 20:34
@SkBlaz
Owner

SkBlaz commented Mar 30, 2026

@copilot The job is failing at the “Check for output divergence” step in Documentation Examples Validation because running python scripts/generate_docs_outputs.py in CI produces changes under examples/docs_outputs/, and the workflow enforces that those generated outputs are already committed in the PR.

You can see this directly in the job logs: it runs git diff --exit-code examples/docs_outputs/ and exits with:

  • “Documentation outputs have diverged!”
  • “Please run 'python scripts/generate_docs_outputs.py' locally and commit the changes.”

Solution (what to change in the PR)

  1. Regenerate the docs outputs exactly the way CI does, then commit the updated files.

    • CI uses Python 3.10, creates a venv, installs numpy scipy, then installs the repo editable (pip install -e .), and runs:
      • python scripts/generate_docs_outputs.py

    Locally, do the equivalent (using your preferred venv tooling), for example:

    ```shell
    python3.10 -m venv .venv
    source .venv/bin/activate
    pip install -U pip
    pip install numpy scipy
    pip install -e .
    python scripts/generate_docs_outputs.py
    ```
  2. Commit whatever changed in examples/docs_outputs/:

    ```shell
    git status
    git add examples/docs_outputs/
    git commit -m "Update generated docs outputs"
    git push
    ```

That should make git diff --exit-code examples/docs_outputs/ clean and the job will pass.
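As an aside, the exit-code contract this check relies on is easy to demonstrate in isolation. The sketch below builds a throwaway repository (not the py3plex checkout) and shows `git diff --exit-code` flipping from 0 to 1 once a tracked file changes:

```python
import pathlib
import subprocess
import tempfile

# Sketch: `git diff --exit-code` exits 0 on a clean tree and 1 when a tracked
# file has uncommitted changes -- the same signal the CI step keys off.
with tempfile.TemporaryDirectory() as repo:
    def git(*args):
        return subprocess.run(["git", *args], cwd=repo, capture_output=True)

    git("init")
    git("config", "user.email", "ci@example.com")
    git("config", "user.name", "CI")
    out = pathlib.Path(repo, "outputs.txt")
    out.write_text("generated v1\n")
    git("add", "outputs.txt")
    git("commit", "-m", "commit generated outputs")

    clean = git("diff", "--exit-code").returncode      # 0: nothing diverged
    out.write_text("generated v2\n")                   # simulate regeneration
    diverged = git("diff", "--exit-code").returncode   # 1: outputs diverged
    print(clean, diverged)                             # 0 1
```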

If this keeps happening (stabilize outputs)

The divergence is usually caused by output that changes between environments/runs. The generator already normalizes some volatile values (timestamps, hex addresses, elapsed seconds) in scripts/generate_docs_outputs.py (see normalize_output() in that file at 1128fa3739f1ac1ca267a470842767ea0521d01e/scripts/generate_docs_outputs.py).

If the diff is due to other unstable content (e.g., memory usage, random seeds, ordering differences), extend normalize_output() to normalize that pattern as well. Example (normalize floating-point noise):

```python
# add near the other regexes
_FLOAT_RE = re.compile(r"(\d+\.\d{4})\d+")

# inside the normalize_output loop, after the other substitutions
line = _FLOAT_RE.sub(r"\1...", line)
```

But for this PR, the required fix is: run the generator and commit the updated examples/docs_outputs/ files.
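Before wiring such a pattern into normalize_output(), it can be exercised on its own. The sketch below is self-contained; `normalize_line` is a stand-in name for the per-line substitution step, not a function in the repo:

```python
import re

# Truncate floating-point noise past four decimal places, as suggested above.
_FLOAT_RE = re.compile(r"(\d+\.\d{4})\d+")

def normalize_line(line: str) -> str:
    # Stand-in for the per-line substitution inside normalize_output().
    return _FLOAT_RE.sub(r"\1...", line)

print(normalize_line("loss=0.123456789 after 2.5 s"))  # loss=0.1234... after 2.5 s
```

Note that floats with four or fewer decimals (like `2.5`) pass through untouched, so stable short values keep their exact form in the committed outputs.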

Contributor Author

Copilot AI commented Mar 30, 2026

@copilot The job is failing at the “Check for output divergence” step in Documentation Examples Validation because running python scripts/generate_docs_outputs.py in CI produces changes ...

Updated in 383624a. I regenerated docs outputs with a clean venv flow (install deps + editable install) and committed all changes under examples/docs_outputs/, including the new .err artifacts. This should resolve the git diff --exit-code examples/docs_outputs/ divergence check.

@SkBlaz
Owner

SkBlaz commented Mar 30, 2026

@copilot Job 69269450185 fails in the “Check for output divergence” step because running python scripts/generate_docs_outputs.py in CI produces changes under examples/docs_outputs/, and the workflow then enforces a clean diff:

  • Workflow check (ref 383624a98e482cf281ab52a75a1d9dd93b0973fe): .github/workflows/docs-examples.yml lines 66–74 run git diff --exit-code examples/docs_outputs/ and fail if anything changed.
  • The logs show: “Documentation outputs have diverged!” and instruct to regenerate and commit outputs.

Solution (what to change in the PR)

  1. Regenerate the docs outputs and commit them so examples/docs_outputs/ matches what CI generates.

    ```shell
    python -m venv .venv
    source .venv/bin/activate
    pip install -U pip
    pip install numpy scipy
    pip install -e .

    python scripts/generate_docs_outputs.py

    git status
    git add examples/docs_outputs/
    git commit -m "Update generated documentation outputs"
    git push
    ```
  2. If you already did this but CI still changes files, the generator likely emits non-deterministic content that isn’t being normalized. The generator already normalizes:

    • timestamps (_LOG_TIMESTAMP_RE)
    • hex addresses (_HEX_ADDRESS_RE)
    • elapsed seconds (_ELAPSED_SECONDS_RE)

    See normalize_output() around lines 97–113 of scripts/generate_docs_outputs.py (ref 383624a98e482cf281ab52a75a1d9dd93b0973fe).

    Code suggestion (stabilize remaining common sources of diffs):

    • Force deterministic hashing and locale in the subprocess environment so output ordering/messages don’t vary across machines.
    • Optionally cap threads for numerical libs to reduce run-to-run variance.

    Patch idea inside run_example() where env is built (around lines 69–79):

    ```python
    env = {
        **subprocess.os.environ,
        "TQDM_DISABLE": "1",
        "PYTHONWARNINGS": "ignore",
        # Add these for determinism:
        "PYTHONHASHSEED": "0",
        "LC_ALL": "C.UTF-8",
        "LANG": "C.UTF-8",
        # Optional: reduce nondeterminism from parallelism
        "OMP_NUM_THREADS": "1",
        "OPENBLAS_NUM_THREADS": "1",
        "MKL_NUM_THREADS": "1",
        "NUMEXPR_NUM_THREADS": "1",
    }
    ```

    After applying such a change, re-run python scripts/generate_docs_outputs.py and commit the updated examples/docs_outputs/.
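For illustration, the three normalizers named in step 2 might look roughly like the sketch below. The patterns here are guesses for demonstration only; the actual regexes live in scripts/generate_docs_outputs.py and may differ:

```python
import re

# Hypothetical reconstruction of the normalizers named above; the exact
# patterns in scripts/generate_docs_outputs.py may differ.
_LOG_TIMESTAMP_RE = re.compile(r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}")
_HEX_ADDRESS_RE = re.compile(r"0x[0-9a-fA-F]+")
_ELAPSED_SECONDS_RE = re.compile(r"\d+\.\d+ seconds")

def normalize_output(text: str) -> str:
    # Replace each volatile value with a stable placeholder so reruns diff clean.
    text = _LOG_TIMESTAMP_RE.sub("<TIMESTAMP>", text)
    text = _HEX_ADDRESS_RE.sub("<ADDR>", text)
    text = _ELAPSED_SECONDS_RE.sub("<N> seconds", text)
    return text

print(normalize_output("2026-03-30 12:00:01 Model at 0x7f2a ran in 1.532 seconds"))
# <TIMESTAMP> Model at <ADDR> ran in <N> seconds
```

The key property is idempotence: running the normalizer over already-normalized output changes nothing, which is exactly what the divergence check needs.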

Why this fixes the job

The job is not failing because the examples crashed; they generated outputs successfully. It fails because the PR does not include the updated generated artifacts (or because generation is not stable). Committing the regenerated examples/docs_outputs/ (and/or making output generation deterministic) will make git diff --exit-code examples/docs_outputs/ pass.

Contributor Author

Copilot AI commented Mar 31, 2026

@copilot Job 69269450185 fails in the “Check for output divergence” step because running python scripts/generate_docs_outputs.py in CI produces changes under examples/docs_outputs/, an...

Addressed in d382c10. I applied deterministic generation fixes in scripts/generate_docs_outputs.py (normalized "took X seconds" timing lines and set deterministic subprocess env vars), regenerated examples/docs_outputs/, and committed the updated artifacts. Targeted validation passes with python -m pytest tests/test_docs_examples.py -q.

@SkBlaz SkBlaz marked this pull request as ready for review March 31, 2026 17:43
@SkBlaz SkBlaz merged commit b99629c into master Mar 31, 2026
33 checks passed


Development

Successfully merging this pull request may close these issues.

ergonomics

2 participants