
Replace POSIX standard mention with LLM-friendly features #58

Merged
turlockmike merged 1 commit into master from docs/condense-readme
Feb 21, 2026

Conversation

@turlockmike
Owner

Swap the POSIX Agent Standard badge for a plain description of what makes murl LLM-friendly: compact JSON, NDJSON streaming, structured stderr errors, semantic exit codes.

Generated with Claude Code

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Copilot AI review requested due to automatic review settings February 21, 2026 00:01
@turlockmike turlockmike merged commit 84a37eb into master Feb 21, 2026
8 checks passed

Copilot AI left a comment


Pull request overview

Updates the README’s positioning to describe concrete agent/LLM-friendly CLI behaviors instead of referencing the POSIX Agent Standard compliance badge.

Changes:

  • Replace the POSIX Agent Standard “Level 2 compliant” badge line with a plain-language description of agent-friendly output/error/exit-code behavior.


```diff
 A curl-like CLI for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers. Query tools, resources, and prompts using simple REST-like URLs.

-**[POSIX Agent Standard](https://github.com/turlockmike/posix-agent-standard) Level 2 compliant** — works natively with AI agents.
+**LLM-friendly:** compact JSON output, NDJSON streaming, structured errors to stderr, semantic exit codes. Built for agents to call from shell.
```
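The added README line compresses four behaviors into one sentence. As a rough illustration of what those conventions look like in a CLI (a hypothetical sketch, not murl's actual implementation; the exit-code values, error JSON shape, and function names are all assumptions):

```python
import json
import sys

# Hypothetical sketch of the conventions the README line names; the exit-code
# values and error JSON shape are illustrative, not taken from murl itself.
EXIT_OK = 0          # success
EXIT_USAGE = 2       # bad arguments
EXIT_TOOL_ERROR = 3  # the MCP server reported a failure

def compact(obj):
    # Compact JSON: no whitespace after separators, fewer tokens for an LLM
    return json.dumps(obj, separators=(",", ":"))

def emit_result(obj, out=sys.stdout):
    # Results go to stdout, one object per line, so they pipe cleanly
    out.write(compact(obj) + "\n")

def emit_error(code, message, err=sys.stderr):
    # Errors are structured JSON on stderr, keeping stdout machine-parseable
    err.write(compact({"error": {"code": code, "message": message}}) + "\n")
```

With this split, an agent can branch on the exit code and parse stderr as JSON instead of scraping free-form error text.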

Copilot AI Feb 21, 2026


The README now says "NDJSON streaming", but the CLI output path emits one JSON object per line only after the full result list is available (it iterates an in-memory list and prints each item). Consider rephrasing to avoid implying true streaming (e.g., "NDJSON (JSON Lines) for list output").

Suggested change

```diff
-**LLM-friendly:** compact JSON output, NDJSON streaming, structured errors to stderr, semantic exit codes. Built for agents to call from shell.
+**LLM-friendly:** compact JSON output, NDJSON (JSON Lines) list output, structured errors to stderr, semantic exit codes. Built for agents to call from shell.
```
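The distinction the comment draws can be sketched in a few lines (hypothetical helper names; neither function is murl's actual code):

```python
import json

def ndjson_after_collection(fetch_all):
    # What the reviewer describes: the full result list is materialized first,
    # then printed one compact JSON object per line (JSON Lines output).
    items = fetch_all()  # blocks until every item is available
    return "".join(json.dumps(i, separators=(",", ":")) + "\n" for i in items)

def ndjson_streaming(fetch_iter):
    # True streaming: each line is yielded as soon as its item arrives,
    # so a consumer can start reading before the list is complete.
    for item in fetch_iter():
        yield json.dumps(item, separators=(",", ":")) + "\n"
```

Both produce identical bytes for a finished list, which is why "NDJSON (JSON Lines) list output" describes the first shape without implying the second.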

